
Congress, TikTok, and Securing Democracy in the Digital Age




In recent weeks, Congress has asserted itself in the TikTok debate by passing legislation that bans the app from federal government devices. Another bipartisan bill, known as the ANTI-SOCIAL CCP Act, would go much further by completely prohibiting TikTok from operating in the United States. With these actions, Congress is taking on more responsibility in deciding the short-form video sharing service’s future. Given the potentially massive impact of whether the app is allowed to continue operating in the U.S., Congress’s engagement and action in this matter are welcome and necessary to help steer—at a critical moment—the direction of the still-evolving U.S. technology strategy toward China.

Since late 2019, TikTok has been under a secretive national security investigation conducted by the Committee on Foreign Investment in the United States. Known as CFIUS, the executive branch committee reviews the national security implications of foreign transactions with U.S. entities and recommends how to address those concerns to the president. For example, in December 2016, President Obama, acting on CFIUS’s recommendation, prohibited a Chinese-owned company’s purchase of a U.S. manufacturer of technology used in U.S. weapons systems.

Some observers have criticized CFIUS for taking more than three years to make a decision on TikTok. The more vital issue and primary reason why Congress should continue to engage is that TikTok implicates more than relatively narrow national security concerns, such as those raised by the transaction blocked by Obama in 2016. Instead, TikTok raises key questions about how the U.S. government should defend the integrity of the U.S.’s fundamentally open, democratic, and rule-bound society in this time of growing competition with the Chinese Communist Party (CCP). Addressing these questions requires a more holistic approach to national interests that goes beyond the scope of CFIUS’s authority and mandate.

TikTok is owned by ByteDance, a global technology company headquartered in Beijing that has a significant business presence in China. Critics argue that ByteDance’s ties to China make it susceptible to CCP and Chinese government control and influence via several mechanisms. First, China’s National Intelligence Law is broadly worded to require all organizations—including a company like ByteDance—and Chinese citizens—including ByteDance employees—to participate in domestic and foreign intelligence activities. Second, the Chinese government has an ownership stake in Beijing Douyin Information Service, the ByteDance subsidiary and TikTok affiliate that runs ByteDance’s China businesses. Though few details are known about the investment, Weibo (China’s version of Twitter) has disclosed a similar arrangement that gives the Chinese government “veto rights over certain matters related to” decisions about content. A third potential pressure point is China’s export control law, which covers TikTok’s video recommendation technology and thus gives the government leverage over the technology’s transfer outside of China.

The CCP’s control and influence over ByteDance raise two core national security concerns about TikTok. The first is that the app could be, and perhaps already is, a mechanism for the CCP to collect the personal information of TikTok users in the U.S. and use it against them. The second is that the CCP could use TikTok for censorship, disinformation, and propaganda campaigns targeting U.S. elections and other essential parts of American society. These concerns and their associated risks lead many to see TikTok as a CCP-controlled information operations tool—a view that makes a ban a reasonably straightforward and acceptable outcome.

One factor that complicates the picture is that millions of people in the U.S. use the TikTok app as a platform for expression protected by the First Amendment. This is a critical reason why Congress should wrestle with TikTok’s fate instead of leaving the matter solely to CFIUS, which is designed to focus only on identifying security concerns, assessing risks, and putting forth solutions for mitigating or eliminating those risks. 

By contrast, Congress has the authority and responsibility to weigh the First Amendment implications—both substantive and symbolic—of shutting down a platform like TikTok. In fact, in cases of national emergency, when a president has the broadest and strongest powers, Congress reserves to itself the power to limit free expression and withholds that authority from the president. Specifically, the International Emergency Economic Powers Act (IEEPA) gives the president broad emergency powers to address foreign threats but denies the executive the authority to ban personal communications and the import or export of any kind of information or informational materials, regardless of the medium, during the emergency.

The ANTI-SOCIAL CCP Act, however, would remove IEEPA’s First Amendment guardrails and simply direct the president to ban TikTok without speaking to free expression concerns. Rather than remaining silent, Congress should consider free expression and other core American values explicitly as part of the TikTok national security analysis and any related legislation it puts forward. Doing so would highlight that one of the U.S.’s most valuable assets is its embrace of these values—a crucial element of U.S. national influence and international credibility.

Taking into account the free expression consequences of a TikTok ban does not necessarily lead to keeping the status quo or letting TikTok continue operating with special restrictions. For example, some members of Congress might agree that TikTok is an expression platform. But they could also correctly point to substantially similar services as viable alternatives for expression, which would help address First Amendment concerns about a prohibition on TikTok. India’s ban on TikTok provides useful data points.

Since 2020, the Indian government has banned more than 300 apps owned by or affiliated with Chinese companies under the country’s Information Technology Act. Concerns have been raised about both the process of the bans, which has been opaque, and their impact on expression. The record so far indicates that, despite concerns that banning TikTok would reduce the number of avenues for online expression in India, several home-grown Indian TikTok-like apps have started to fill the void. Among them is ShareChat’s Moj, which has roughly 100 million creators and more than 300 million users—180 million more than TikTok’s India user base shortly before it was banned. Foreign video apps, including YouTube and Instagram, have also expanded their footprint in the wake of the TikTok ban.

Congress could also conclude that a middle ground—allowing TikTok to continue operating in the U.S. but with significant restrictions—would entangle the government in policing speech so much that a TikTok ban or a forced sale of the company is simply the lesser of two evils. In this middle-ground scenario, the U.S. government would oversee TikTok’s video recommendation system and the definition and execution of its content moderation policies. This power would likely be exercised through a third party entrusted to supervise, audit, report on, and perhaps modify TikTok’s code and content regulation policies. 

TikTok has stated that it is preparing for a middle-ground outcome by putting controls in place pursuant to a business partnership with Oracle Corporation. For example, under the agreement, Oracle will be responsible for ensuring that TikTok’s algorithm is only trained on Oracle infrastructure, will “ensure appropriate third-party security vetting and validation of the algorithm,” and will, along with third-party auditors and monitors, have access to TikTok source code. Oracle is also tasked with reviewing TikTok’s content moderation policies and practices, which would involve assessing TikTok’s automated content moderation systems and human content review teams to ensure that both act in accordance with TikTok’s community guidelines and other content policies and are not manipulated by the CCP. 

Though these controls might prove effective, Oracle’s work itself would require the U.S. government’s supervision to ensure that concerns about TikTok are adequately addressed. Putting the government in the position of influencing content moderation calls—some of which might ultimately have little or nothing to do with CCP influence operations—could raise reasonable First Amendment concerns and lead to the conclusion that divestment or a ban is a cleaner approach. Moreover, members of Congress could determine that U.S. government engagement with TikTok’s content policies and practices would effectively set the bar for other media companies’ content moderation efforts, thus constituting a subtle and concerning way of regulating expression beyond TikTok.

As the above analysis suggests, these discussions and decisions can happen largely in the open rather than in classified settings, which is especially salient when a decision is about a service that is so close to and increasingly ingrained in the daily lives of millions of Americans. TikTok has roughly 100 million U.S. users: approximately one out of three people over 13, the minimum age in the U.S. to sign up for social media services. Unlike other government decisions that decouple America’s technology infrastructure from China’s—including restrictions on exporting advanced integrated circuits to China, bans on the sale in the U.S. of Chinese telecommunications equipment and devices, and prohibitions on connecting U.S. companies’ fiber optic cable networks to China—a decision to ban TikTok would separate Americans from a Chinese technology product in a very visible and tangible way. Members of Congress are best positioned to understand the impact of a TikTok ban or restriction on the day-to-day lives of their constituents and explain such a government action to them—preferably as one that stems from elected representatives rather than less visible executive branch officials.

Leaving the TikTok decision solely to CFIUS also means missing an opportunity for Congress to debate and take action on digital policy matters in a specific and concrete context. For example, one of the critical issues for both Republicans and Democrats raised by TikTok is algorithmic manipulation, especially by malign actors attempting to harm individuals or vital American institutions and processes like free and fair elections. As noted above, CFIUS could address concerns about CCP algorithmic manipulation by imposing mitigations on TikTok, such as requiring a third party to monitor its algorithms for CCP manipulation. 

With proposed legislation like the ANTI-SOCIAL CCP Act as a catalyst, Congress could look more broadly at existing legislative and regulatory models that could be applied to online services subject to the jurisdiction, control, or influence of the CCP. For example, if Congress allows TikTok to continue operating in the U.S. (or if it bans TikTok but wants to put in place algorithmic transparency rules for other services susceptible to the CCP), it could draw from the European Union’s Digital Services Act (DSA) as a starting point on algorithmic transparency and accountability rules for TikTok and similar services. Among other things, the DSA requires platforms to tell users how they recommend content, including the most significant signals influencing what information they present (TikTok and others are starting to implement preliminary measures consistent with the DSA’s requirements).

A review of TikTok would also allow Congress to ask questions about and signal positions on some deeply important yet not often debated questions stemming from the separation of America’s tech ecosystem from China’s. For example, ByteDance might be, as some critics suggest, a puppet of the CCP. Yet it is also a corporation headed by a CEO who has moved to Singapore, overseen by a five-person board of directors with three members who represent U.S. venture capital firms (a fourth represents a China affiliate of a U.S. venture firm), and that seems to be trying to find a way to exist as a global company at least partly beyond the reach of the CCP. Should the U.S. government help pull ByteDance away from the CCP by allowing it to continue operating with mitigations required to access the U.S. market? More broadly, what should the U.S. do with what could be seen as contestable companies with Chinese ties and the entrepreneurs and technical talent that are growing those companies? Beyond China, Congress could explore the impact of the TikTok matter on U.S. government efforts to protect cross-border data flows and push back on restrictive forms of digital sovereignty in various regions of the world. 

These questions are, like those relating to free expression, beyond the scope of a narrow national security analysis. But they are deeply tied to the U.S. government’s commitment to an internet that is “open, free, global, interoperable, reliable, and secure” and “reinforces democratic principles and human rights and fundamental freedoms.” One answer may be to cut Chinese companies off from the technology stack of the U.S., its allies, and partners; as the Huawei case shows, U.S. domestic restrictions on technology tend to flow abroad. An alternative would be to keep these companies connected to U.S. technology infrastructure, standards, and practices in ways that, over time, would grow global corporate participation in the technology ecosystem of the U.S., its partners, and allies, thus shrinking the CCP’s influence in the global tech market.

Congress could decide to keep the U.S. digital environment open to Chinese tech companies and entrepreneurs with guardrails to ensure that the CCP cannot access personal and sensitive data and standards to promote transparency, adherence to the rule of law, and the protection of human rights. Like the proposed arrangement between TikTok and Oracle, Congress could support a broader policy of encouraging Chinese social media, e-commerce, gaming, and other digital companies to host their workloads on third-party owned and managed cloud infrastructure that keeps data and code out of the hands of the CCP and ensures that the services operate transparently and securely. This path would be tougher than banning apps, but it is a policy direction that Congress should consider as it decides what action to take on TikTok.

Simply by engaging in an open debate on these matters, Congress could ensure that restrictions on or a ban of TikTok are not associated with the world’s app-banning club: China, Russia, Iran, and, unfortunately, India, where the banning of China-affiliated apps is occurring in tandem with broader efforts by the government to block or filter internet content. Regardless of the outcome, debating solutions to the challenges posed by TikTok in congressional committees and on the Senate and House floors sends a strong signal of the transparency and openness of America’s decision-making processes. This is critically important as the U.S. rivalry with China increasingly becomes a competition of contrasts that puts to the test open, democratic, and rule-bound societies and systems of government against their opposite.

There are, of course, legitimate objections to Congress taking a more central role in the TikTok decision. For one, passing legislation often takes more time than executive branch decisions—potentially even longer than TikTok’s three-plus-year CFIUS review. Another concern is that Congress would overly politicize TikTok or treat it less rigorously than would CFIUS, especially during a time of increased polarization in Congress that might bring more heat than light to the matter. Finally, Congress does not have the time or resources to consider every China-related tech transaction that merits a national security review, which is why executive branch entities like CFIUS are needed.

Congress’s capacity to review China-related transactions is certainly limited. What Congress could do with its limited bandwidth, however, is take the TikTok matter as a high-profile opportunity to evolve the national security review of transactions involving Chinese technology and align it more closely with the principles of a liberal democracy.

As the national security review of technology-related transactions involving China becomes more commonplace, Congress could explicitly broaden the scope of national security assessments to include important variables such as the potential impact on core rights and the effect on American projection of soft power. Similar broader authority already exists with the Federal Communications Commission (FCC), which has traditionally been and continues to be primarily responsible for assessing proposed foreign ownership of U.S. media companies. With the assistance of the Committee for the Assessment of Foreign Participation in the United States Telecommunications Services Sector (known informally as Team Telecom), the FCC reviews such transactions for law enforcement, foreign policy, and trade policy concerns, in addition to national security issues. As the analysis above suggests, taking into account these matters enables a more complete assessment of how U.S. government action on a particular transaction will impact the bigger picture of U.S.-China competition and the core values and principles of American society, which are among the key assets the U.S. government endeavors to protect.

Second, Congress should consider unifying the review of Chinese (and perhaps other challenging countries’) technology-related transactions under one government body to ensure consistency, create efficiencies, and grow and retain executive branch expertise on technology-related national security issues. As noted above, CFIUS is not the only U.S. government organization conducting national security reviews of technology transactions with China-affiliated entities. The FCC, with the support of Team Telecom, conducts these reviews as well. 

Another similar review body is the Commerce Department’s Bureau of Industry and Security (BIS), which administers the information and communications technology and services (ICTS) supply chain regulations promulgated under the Trump administration and proposed to be amended by the Biden administration to explicitly cover apps. Both CFIUS and BIS have broad authority to review, modify, and block transactions. Because the two organizations stem from different legislative authorities and differ in their processes and scope of authority, over time they risk producing inconsistent and even contradictory results. 

For example, BIS’s ability to hinder free expression via the ICTS supply chain regulations is limited because the rules are promulgated under IEEPA. By contrast, CFIUS is not similarly constrained but is also not intended to even consider such questions. Accordingly, if BIS wished to restrict or outright block a Chinese app, its proposed action would likely be enjoined by the courts, while CFIUS could order a similar restriction or ban and probably be given deference by the courts. Avoiding diverging outcomes will be particularly important if the current U.S.-China tension is not an acute moment in time that affects only a few transactions but, rather, a secular move to a more competitive and restricted relationship between the countries that will impact business transactions more broadly and frequently.

Third, such a technology transactions review committee should operate much more transparently than CFIUS and similar U.S. government organizations do today so that the American public can understand clearly what threats the U.S. is addressing and how. This transparency could also help build precedent to guide the committee and establish expectations among relevant stakeholders. In addition, transparency could generate baseline rules, such as official guidance and general licenses, for how companies can proceed with certain tech transactions in a way that addresses the U.S. government’s national security concerns. None of this would have to eliminate case-by-case reviews, but the U.S. government cannot scale to review all relevant transactions. Public disclosure and safe harbor rules based on precedent would help leverage the review committee’s work. 

Whatever one thinks of the policy outcomes of the ANTI-SOCIAL CCP Act and legislation like it, the bill underscores Congress’s responsibility and opportunity to continue engaging in the TikTok matter. TikTok is not just a blip on the radar screen but, rather, a major milestone on what is an increasingly clear but treacherous path toward a tech schism between the U.S. and China and longer-term technology competition with China. With that in mind, Congress has the opportunity to take direct action on a matter that will impact hundreds of millions of Americans, move toward a more holistic approach to U.S. national security in the digital age, and engage in a process that both highlights and leverages the open, democratic governance that the U.S. represents on the global stage.
