Why we created an AI community care agreement

By Pilar de Haro

The question about the role of AI in journalism is ubiquitous these days. Headlines like “Can Journalism Survive AI?” and “How AI Is Already Transforming the News Business” pop up in our feeds constantly. These discussions focus on the “promise versus peril” of this new and quickly evolving technology for the field, especially related to content, productivity tools and revenue. In the journalism support space that we at Tiny News Collective occupy, the question about AI has come up in another way: around community management and engagement.

After looking at the question from many angles, we decided to create a community care agreement for AI tools, so that people can speak openly at our online meetings while others record those meetings with AI tools, as long as they follow guidelines around privacy and data retention that respect all participants.

At TNC, we cultivate and support a community of news founders who are in the early stages of launching and developing media outlets all over the country. An invaluable part of joining the Collective is being in community with fellow members figuring out the same thing: how to build and sustain a news organization for the first time. 

As part of our ongoing programming, we host multiple group video calls each month to provide folks with a space to discuss freely their journey of news entrepreneurship. Sometimes, that means sharing their growth and experiments, which can excite and inspire new faces on a Zoom call. But it’s not always an easy path, and we’ve learned that sharing the struggles is just as important as celebrating the wins.

Creating a brave space

The responsibility of creating a brave space for people to be vulnerable is something that we at TNC work hard to honor. And that’s why we hadn’t, as a rule, recorded these meetings. The most we had done was take notes and share out the high points with members on Slack and on our Member Portal. Sensitive information is never included in those notes.

But sometimes people are busy and can’t make the calls, of course. Life and work get in the way, and they don’t want to miss out on the discussion. So some members had set up an AI assistant to “join” the meetings for them, recording and transcribing the calls. Mostly, that has been through Otter.ai, a tool many journalists use to record and transcribe spoken conversations in real time, providing a written record of meetings or interviews; its features include automatic transcription and speaker identification, among others. But that brought up an important community management question: How do we balance the need for privacy and brave spaces with accessibility, another value that we hold dear?

After seeing these AI assistants show up in our gatherings, we knew that we needed to look into the issue. First, I went to our members to get their perspective. We used our community spaces like Slack and collective calls to engage them and get their feedback. Several members underscored how convenient the tool had been when they couldn’t attend meetings.

Then, we looked into whether other journalism organizations were creating policies on AI assistant tools. I didn’t find much other than those aimed at editorial ethics, such as, “Your newsroom needs an AI ethics policy. Start here” and “Writing guidelines for the role of AI in your newsroom? Here are some, er, guidelines for that.”

Learning from other communities

At that point, our executive director at TNC, Amy Kovac-Ashley, recommended I reach out to the folks at the DEI Coalition for Anti-Racist, Equitable, and Just Newsrooms about their approach to community care. I spoke to Sophie Ho, who led the Safety and Moderation Committee in the creation of the DEI Coalition Slack in 2021. She talked about their process of building principles on what safety means, about being mindful of power dynamics so that folks can speak freely, and about using data retention policies so that conversations aren’t online forever. Another helpful resource was Sisi Wei’s guide for turning private conversations into public resources through community consent, which includes these community agreements.

Taking into account the various needs of our members and the advice from Ho and Wei, we created a draft version of what we now call our Community Care Agreement on AI Tools. We circulated the draft document for comments among members, discussed it during our collective calls and then enacted it as a policy this spring. A public excerpt from the agreement is included here. (The original, fuller version lives in our Member Portal accessible to TNC members only.)

In essence, the policy allows for members to record the calls using AI assistants but with some caveats. As advised by Ho, we included a three-month retention period for collective call recordings and used the framework from the DEI Coalition about refraining from sharing other members’ personal information unless consent is provided. That also extends to providing meeting credentials to people who are not members and shouldn’t have access to those spaces, or the recordings of those meetings. 

As our community grows, we are making new members aware of these guidelines and expectations during our onboarding process — and soliciting their feedback as well. We expect this policy will evolve as more members engage with it and as more AI technology is introduced to our community spaces. We want to be ready to respond with care and thoughtfulness.

If you’re thinking about putting in place a community care agreement about AI tools in your community spaces, we’d love to hear from you, share what we know and learn from you. Please get in touch with us here.

Pilar de Haro is the support engineer for Tiny News Collective.
