Solomon Messing's work sits at the intersection of social science and machine learning. In his current role at Twitter, he leads an organization working on some of the hardest problems at the company:
- How can we understand and reward the value people bring when they create content or connect with others on the platform?
- How can we grapple with the short- and long-term tradeoffs between engagement and action against problematic discourse?
- How can we extract meaningful signal from messy, high-dimensional data to better understand that value?
- How can we design experiments to estimate that value in the presence of substantial network- and congestion effects?
He has led technical teams since 2006, with regular executive communication and public outreach. During the 2020 election cycle, he was Chief Scientist at ACRONYM, where his team drove investment by (correctly) modeling the electoral importance of Georgia, leveraged hundreds of field experiments to build an ML system producing real-time estimates of persuasive-messaging impact, and conducted the largest-ever digital advertising field experiment.
At Facebook, he led the technical effort to release a differentially private data set reflecting more than an exabyte of media data, and worked on prototypes that blend ML and experimentation (heterogeneous effect estimation and contextual bandits).
He founded the Data Lab at Pew Research Center, where his team conducted algorithmic audits of Google Image searches and of news photos on Facebook, used ML to study inauthentic and automated behavior on Twitter, and used NLP to understand the ideology and power structures embedded in how members of Congress communicate on social media.
His work on election forecasts sparked public debates about the ethics and consequences of models projecting the results of elections, and prompted FiveThirtyEight to change their user interface to address concerns raised by the work.
His doctoral work at Stanford focused on social influence and social structure in digital environments, and included what was probably the largest study of algorithmic bias in social media, later published in Science. He also uncovered bias in visual portrayals of African American candidates in political attack ads, worked with a lab at Lucile Packard Children's Hospital on a number of imaging studies, and wrote a book, The Impression of Influence, with Justin Grimmer and Sean Westwood, published by Princeton University Press.