
TikTok opens transparency center as lawmakers consider US ban

TikTok is staring down the barrel of an outright ban in the US. It’s already been banned on federal employee devices, blocked by dozens of universities across the country, and lawmakers are calling for its removal from US app stores.

It’s in that context that earlier this week, I and a handful of other journalists were invited to the company’s Los Angeles headquarters for the first media tour of the Transparency and Accountability Center. It’s a space that, like the political discussion of TikTok these days, seems to be more about signaling than substance. Company officials say the center is designed for regulators, academics, and auditors to learn more about how the app works and its security practices. We were told that a politician, whom staff declined to name, had toured it the day before. TikTok eventually plans to open more centers in Washington, DC, Dublin, and Singapore.

Our tour was part of a multi-week press blitz by TikTok to push Project Texas, a new proposal that would wall off US user data and that the company is pitching to the US government as an alternative to a full ban. TikTok CEO Shou Zi Chew was in DC last week to give a similar pitch to policymakers and think tanks. He is expected to testify before Congress for the first time in March.

What you see when you first enter TikTok’s Transparency Center.
Photo by Allison Zaucha for The Verge

TikTok isn’t the first contentious tech company to lean on the spectacle of a physical space during a PR crisis. In 2018, Facebook invited journalists to tour the “War Room,” which was really just a glorified conference room filled with employees staring at social media feeds and dashboards. Pictures were taken, stories were written, and about a month later the War Room was closed.

Similarly, TikTok’s Transparency Center is a lot of smoke and mirrors designed to give the impression of transparency rather than actually providing it. Large touchscreens explain how TikTok works at a high level, along with a broad overview of the kind of trust and safety efforts that have become table stakes for every major platform.

An important difference, however, is a room that my tour group was not allowed to enter. Behind a wall with Death Star-like mood lighting, TikTok officials said a server room holds the app’s source code for third-party auditors to review. Anyone entering must sign a non-disclosure agreement, go through metal detectors, and lock their phone in a locker. (It was not clear exactly who would be allowed to enter the room.)

A room where you can interact with a fake version of the moderation software TikTok uses.
Photo by Allison Zaucha for The Verge

The interactive part of the center I got to experience included a room of iMacs running a fake version of the software TikTok says its moderators use to rate content. There was another room of iMacs running “code simulators.” While that sounded intriguing, it was really just a basic explanation of the TikTok algorithm that seemed designed to be understood by a typical congressman. Close-up photos of the computer screens were not allowed. And despite being called a Transparency Center, TikTok’s PR department made sure everyone agreed not to quote or directly attribute comments made by staff leading the tour.

On the moderator’s workstation, I was presented with a number of potentially violating videos to review, along with basic information such as the accounts that posted them and the number of likes and reshares of each video. When I pulled up one of a man talking into the camera with the caption “the world is bringing up 9/11 to justify Muslims as t3rrori$ts,” the moderation system asked me to select whether it violated any of three policies, including one on “threats and incitement to violence.”
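For illustration only, here is a rough mock-up of what that review flow might look like as code, assuming hypothetical field and policy names. It reflects just what the demo surfaced (the posting account, likes, reshares, and a short list of candidate policies), not TikTok’s actual moderation tooling.

```python
# Hypothetical sketch of a moderation queue item and a reviewer decision.
# Field names and the policy list are assumptions, not TikTok's real schema.
from dataclasses import dataclass, field

POLICIES = [
    "threats_and_incitement_to_violence",  # the one policy named on the tour
    "hateful_behavior",                    # assumed placeholder
    "harassment_and_bullying",             # assumed placeholder
]

@dataclass
class ReviewItem:
    video_id: str
    account: str
    caption: str
    likes: int
    reshares: int
    violations: list[str] = field(default_factory=list)

def record_decision(item: ReviewItem, selected: list[str]) -> ReviewItem:
    # Keep only policies the reviewer actually selected from the offered list.
    item.violations = [p for p in selected if p in POLICIES]
    return item

# Usage: a reviewer flags one queued video against a single policy.
item = ReviewItem("v123", "@example", "…", likes=5400, reshares=210)
record_decision(item, ["threats_and_incitement_to_violence"])
```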

At the code simulator iMac in the other room, I hoped to learn more about how TikTok’s recommendation system really works. After all, this was a physical place you had to travel to. Surely there would be information I couldn’t find anywhere else?

What I got was this: TikTok starts by using a “coarse machine learning model” to select “a subset of a few thousand videos” from the billions hosted by the app. Then a “medium machine learning model further narrows the recall pool to a smaller pool of videos” that it thinks you’ll be interested in, like the ones that show up on your For You page.

The information displayed was frustratingly vague. One slide stated that TikTok “recommends content by ranking videos based on a combination of factors, including the interests new users bring to TikTok when they first use the app, and changing preferences over time.” That’s exactly how you’d expect it to work.
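Taken at face value, the description amounts to a standard two-stage retrieval-and-ranking pipeline. Below is a minimal sketch of that flow, where simple topic-overlap and like-rate heuristics stand in for the “coarse” and “medium” machine learning models TikTok referred to; the features, weights, and pool sizes are assumptions for illustration, not TikTok’s code.

```python
# Sketch of a two-stage recommendation flow: a cheap model trims a huge corpus
# to a recall pool of a few thousand candidates, then a richer model ranks that
# pool down to the handful of videos actually served. All scoring is illustrative.
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    features: dict  # e.g. {"topics": [...], "like_rate": 0.12} (assumed)

def coarse_score(video: Video, user_interests: set[str]) -> float:
    # Stage-1 stand-in: count overlapping topics.
    return float(len(user_interests & set(video.features.get("topics", []))))

def medium_score(video: Video, user_interests: set[str]) -> float:
    # Stage-2 stand-in: blend topic overlap with an engagement signal.
    return 0.7 * coarse_score(video, user_interests) + 0.3 * video.features.get("like_rate", 0.0)

def recommend(corpus: list[Video], user_interests: set[str],
              recall_size: int = 3000, feed_size: int = 30) -> list[Video]:
    # Stage 1: coarse model selects a subset of a few thousand candidates.
    recall_pool = sorted(corpus, key=lambda v: coarse_score(v, user_interests),
                         reverse=True)[:recall_size]
    # Stage 2: medium model narrows the recall pool to the videos served in the feed.
    return sorted(recall_pool, key=lambda v: medium_score(v, user_interests),
                  reverse=True)[:feed_size]
```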

Eric Han, Head of USDS Trust and Safety at TikTok.
Photo by Allison Zaucha for The Verge

TikTok first attempted to open this transparency center in 2020, when then-President Donald Trump was trying to ban the app and Kevin Mayer was in the middle of his three-month stint as CEO. Then the pandemic intervened, delaying the center’s opening until now.

Over the past three years, TikTok’s trust deficit in DC has only widened, fueled by growing anti-China sentiment that started on the right and has since become more bipartisan. The worst revelation came in late December when the company confirmed that employees improperly accessed the location data of several US journalists as part of an internal leak investigation. That same month, FBI Director Chris Wray warned that China could use TikTok to “manipulate content and, if they want, use it for influence operations.”

TikTok’s answer to these concerns is Project Texas, a highly technical, unprecedented plan that would wall off most of TikTok’s US operations from its Chinese parent company, ByteDance. To make Project Texas a reality, TikTok is relying on Oracle, whose billionaire founder, Larry Ellison, used his connections as an influential Republican donor to personally secure Trump’s blessing in the early stages of the negotiations. (No one from Oracle was present at the briefing I attended, and my request to speak to someone there for this story went unanswered.)

Photo by Allison Zaucha for The Verge

I was given a brief overview of Project Texas before the tour, though I was asked not to directly quote the employees who presented it. One image I was shown depicted a Supreme Court-style building with five pillars representing the issues Project Texas is meant to address: org design, data protection and access control, technical assurance, content assurance, and compliance and monitoring.

TikTok says it has already taken thousands of people and more than $1.5 billion to build Project Texas. The effort involves TikTok creating a separate legal entity, called USDS, with a board independent of ByteDance that reports directly to the US government. More than seven third-party auditors, including Oracle, will review all data flowing in and out of the US version of TikTok. Only US user data will be available to train the algorithm in the US, and TikTok says there will be strict compliance requirements for any internal access to US data. If approved by the government, the proposal is estimated to cost TikTok $700 million to $1 billion a year to maintain.

Whether Project Texas satisfies the government or not, it certainly looks like working at TikTok is going to get harder. The US version of TikTok must be completely taken apart, rebuilt, and published by Oracle to US app stores. Oracle will also have to review each app update. Duplicate roles are being created for TikTok in the US, even though the same roles already exist elsewhere in the company. And app performance could suffer when Americans interact with users and content in other countries, since US user data must be managed inside the country.

Photo by Allison Zaucha for The Verge

One name that was not mentioned during the entire briefing: ByteDance. I got the impression that TikTok employees were uncomfortable talking about their relationship with their parent company.

While ByteDance was not directly acknowledged, its ties to TikTok were not hidden either. The Wi-Fi for the building I was in was called ByteDance, and the conference room screens in the Transparency Center showed Lark, the internal communication tool ByteDance developed for its employees around the world. At one point during the tour, I tried to ask what would hypothetically happen if, once Project Texas gets the go-ahead, a ByteDance employee in China made an awkward request of an employee of TikTok’s US entity. I was quickly told by a member of TikTok’s PR team that the question was not appropriate for the tour.

In the end, I was left with the feeling that TikTok, like its powerful algorithm, built its Transparency Center to show people what it thinks they want to see. The company seems to have realized it won’t save itself from a US ban on the technical merits of the Project Texas proposal alone. The debate is now purely a matter of politics and optics. Unlike the tour I was on, that’s something TikTok has no control over.

