The Facebook Oversight Board was designed to make some of Facebook’s most difficult decisions for the company. But on Wednesday the board put one of the biggest dilemmas facing the platform back on Facebook and its CEO, Mark Zuckerberg.

The board said Facebook was right to suspend Trump in the immediate aftermath of the January 6 insurrection, but that Facebook couldn’t simply make the suspension “indefinite” with no actual rule on its books allowing for that. The board said Facebook must review the decision and determine whether Trump should be banned from the platform forever.

The board could have made that decision itself. To some it might appear that, in not doing so, it was passing the buck, unwilling to make a controversial call or test its authority. But by handing the decision back to Facebook, it put Zuckerberg’s powerful role in overseeing public discourse in the United States in the spotlight, along with the arbitrary nature of how Facebook moderates its platform. Forcing Facebook to make the decision was itself an exercise of the board’s power and independence.

The decision was in many ways not just about Trump. It may ultimately prove more important for the parts that were only secondarily about him. The ruling was a shot across Facebook’s bow, warning the company that it has to get its house in order and can no longer make world-altering decisions on the fly. It also served as notice to people – including me – who doubted the board’s importance and thought it might serve as a rubber stamp for Facebook. Critics were concerned the board was there to make Facebook’s life easier; no one can say this decision did so.

Facebook now has six months from Wednesday to decide Trump’s fate. When it does, the board said, it must follow its own rules for what penalties can be meted out. That is no small thing.
Much of the content moderation done by social networks – not just Facebook but all of them – is essentially arbitrary. The companies take action against some bad material when they find it; other times they take no action at all. There is often no evident rhyme or reason for why some content is judged worthy of removal and some is not. And the companies’ rules frequently seem less like principles that guide decisions than vagaries invoked to justify decisions already made. With this ruling, the board told Facebook that this is no longer OK.

The board did one other thing that may push Facebook toward better, more thoughtful and less haphazard moderation. Though it did not mandate this, it told Facebook that the company should consider “a comprehensive review of Facebook’s potential contribution to the narrative of electoral fraud and the exacerbated tensions that culminated in the violence in the United States on January 6.” The board said such a review “should be an open reflection on the design and policy choices that Facebook has made that may allow its platform to be abused.”

The overall message was clear: Facebook has long operated as if it’s fine to wait until January 7 to take action to prevent January 6. That time should now be over.