TIKTOK v. GARLAND
24-656 (2025)
NOTICE: This opinion is subject to formal revision before publication in the United States Reports. Readers are requested to notify the Reporter of Decisions, Supreme Court of the United States, Washington, D. C. 20543, pio@supremecourt.gov, of any typographical or other formal errors.
SUPREME COURT OF THE UNITED STATES
_________________
Nos. 24–656 and 24–657
_________________
TIKTOK INC., et al., PETITIONERS (24–656)
v.
MERRICK B. GARLAND, ATTORNEY GENERAL
BRIAN FIREBAUGH, et al., PETITIONERS (24–657)
v.
MERRICK B. GARLAND, ATTORNEY GENERAL
ON APPLICATIONS FOR INJUNCTION PENDING REVIEW TO THE UNITED STATES COURT OF APPEALS FOR THE DISTRICT OF COLUMBIA CIRCUIT.
[January 17, 2025]
PER CURIAM.
As of January 19, the Protecting Americans from Foreign Adversary Controlled Applications Act will make it unlawful for companies in the United States to provide services to distribute, maintain, or update the social media platform TikTok, unless U. S. operation of the platform is severed from Chinese control. Petitioners are two TikTok operating entities and a group of U. S. TikTok users. We consider whether the Act, as applied to petitioners, violates the First Amendment.
In doing so, we are conscious that the cases before us involve new technologies with transformative capabilities. This challenging new context counsels caution on our part. As Justice Frankfurter advised 80 years ago in considering the application of established legal rules to the "totally new problems" raised by the airplane and radio, we should take care not to "embarrass the future." Northwest Airlines, Inc. v. Minnesota, 322 U.S. 292, 300 (1944). That caution is heightened in these cases, given the expedited time allowed for our consideration.[1] Our analysis must be understood to be narrowly focused in light of these circumstances.
I
A
TikTok is a social media platform that allows users to create, publish, view, share, and interact with short videos overlaid with audio and text. Since its launch in 2017, the platform has accumulated over 170 million users in the United States and more than one billion worldwide. Those users are prolific content creators and viewers. In 2023, U. S. TikTok users uploaded more than 5.5 billion videos, which were in turn viewed more than 13 trillion times around the world.
Opening the TikTok application brings a user to the "For You" page—a personalized content feed tailored to the user's interests. TikTok generates the feed using a proprietary algorithm that recommends videos to a user based on the user's interactions with the platform. Each interaction a user has on TikTok—watching a video, following an account, leaving a comment—enables the recommendation system to further tailor a personalized content feed.
A TikTok user's content feed is also shaped by content moderation and filtering decisions. TikTok uses automated and human processes to remove content that violates the platform's community guidelines. See 1 App. 493–497. TikTok also promotes or demotes certain content to advance its business objectives and other goals. See id., at 499–501.
TikTok is operated in the United States by TikTok Inc., an American company incorporated and headquartered in California. TikTok Inc.'s ultimate parent company is ByteDance Ltd., a privately held company that has operations in China. ByteDance Ltd. owns TikTok's proprietary algorithm, which is developed and maintained in China. The company is also responsible for developing portions of the source code that runs the TikTok platform. ByteDance Ltd. is subject to Chinese laws that require it to "assist or cooperate" with the Chinese Government's "intelligence work" and to ensure that the Chinese Government has "the power to access and control private data" the company holds. H. R. Rep. No. 118–417, p. 4 (2024) (H. R. Rep.); see 2 App. 673–676.
B
1
In recent years, U. S. government officials have taken repeated actions to address national security concerns regarding the relationship between China and TikTok.
In August 2020, President Trump issued an Executive Order finding that "the spread in the United States of mobile applications developed and owned by companies in [China] continues to threaten the national security, foreign policy, and economy of the United States." Exec. Order No. 13942, 3 CFR 412 (2021). President Trump determined that TikTok raised particular concerns, noting that the platform "automatically captures vast swaths of information from its users" and is susceptible to being used to further the interests of the Chinese Government. Ibid. The President invoked his authority under the International Emergency Economic Powers Act (IEEPA), 50 U. S. C. §1701 et seq., and the National Emergencies Act, 50 U. S. C. §1601 et seq., to prohibit certain "transactions" involving ByteDance Ltd. or its subsidiaries, as identified by the Secretary of Commerce. 3 CFR 413. The Secretary published a list of prohibited transactions in September 2020. See 85 Fed. Reg. 60061 (2020). But federal courts enjoined the prohibitions before they took effect, finding that they exceeded the Executive Branch's authority under IEEPA. See generally TikTok Inc. v. Trump, 507 F. Supp. 3d 92 (DC 2020); Marland v. Trump, 498 F. Supp. 3d 624 (ED Pa. 2020).
Just days after issuing his initial Executive Order, President Trump ordered ByteDance Ltd. to divest all interests and rights in any property "used to enable or support ByteDance's operation of the TikTok application in the United States," along with "any data obtained or derived from" U. S. TikTok users. 85 Fed. Reg. 51297. ByteDance Ltd. and TikTok Inc. filed suit in the D. C. Circuit, challenging the constitutionality of the order. In February 2021, the D. C. Circuit placed the case in abeyance to permit the Biden administration to review the matter and to enable the parties to negotiate a non-divestiture remedy that would address the Government's national security concerns. See Order in TikTok Inc. v. Committee on Foreign Investment, No. 20–1444 (CADC, Feb. 19, 2021).
Throughout 2021 and 2022, ByteDance Ltd. negotiated with Executive Branch officials to develop a national security agreement that would resolve those concerns. Executive Branch officials ultimately determined, however, that ByteDance Ltd.'s proposed agreement did not adequately "mitigate the risks posed to U. S. national security interests." 2 App. 686. Negotiations stalled, and the parties never finalized an agreement.
2
Against this backdrop, Congress enacted the Protecting Americans from Foreign Adversary Controlled Applications Act. Pub. L. 118–50, div. H, 138 Stat. 955. The Act makes it unlawful for any entity to provide certain services to "distribute, maintain, or update" a "foreign adversary controlled application" in the United States. §2(a)(1). Entities that violate this prohibition are subject to civil enforcement actions and hefty monetary penalties. See §§2(d)(1)(A), (d)(2)(B).
The Act provides two means by which an application may be designated a "foreign adversary controlled application." First, the Act expressly designates any application that is "operated, directly or indirectly," by "ByteDance Ltd." or "TikTok," or any subsidiary or successor thereof. §2(g)(3)(A). Second, the Act establishes a general designation framework for any application that is both (1) operated by a "covered company" that is "controlled by a foreign adversary," and (2) "determined by the President to present a significant threat to the national security of the United States," following a public notice and reporting process. §2(g)(3)(B). In broad terms, the Act defines "covered company" to include a company that operates an application that enables users to generate, share, and view content and has more than 1,000,000 monthly active users. §2(g)(2)(A). The Act excludes from that definition a company that operates an application "whose primary purpose is to allow users to post product reviews, business reviews, or travel information and reviews." §2(g)(2)(B).
The Act's prohibitions take effect 270 days after an application is designated a foreign adversary controlled application. §2(a)(2). Because the Act itself designates applications operated by "ByteDance, Ltd." and "TikTok," prohibitions as to those applications take effect 270 days after the Act's enactment—January 19, 2025.
The Act exempts a foreign adversary controlled application from the prohibitions if the application undergoes a "qualified divestiture." §2(c)(1). A "qualified divestiture" is one that the President determines will result in the application "no longer being controlled by a foreign adversary." §2(g)(6)(A). The President must further determine that the divestiture "precludes the establishment or maintenance of any operational relationship between the United States operations of the [application] and any formerly affiliated entities that are controlled by a foreign adversary, including any cooperation with respect to the operation of a content recommendation algorithm or an agreement with respect to data sharing." §2(g)(6)(B). The Act permits the President to grant a one-time extension of no more than 90 days with respect to the prohibitions' 270-day effective date if the President makes certain certifications to Congress regarding progress toward a qualified divestiture. §2(a)(3).
C
ByteDance Ltd. and TikTok Inc.—along with two sets of TikTok users and creators (creator petitioners)—filed petitions for review in the D. C. Circuit, challenging the constitutionality of the Act. As relevant here, the petitioners argued that the Act's prohibitions, TikTok-specific foreign adversary controlled application designation, and divestiture requirement violate the First Amendment.
The D. C. Circuit consolidated and denied the petitions, holding that the Act does not violate petitioners' First Amendment rights. 122 F. 4th 930, 940, 948–965 (CADC 2024). After first concluding that the Act was subject to heightened scrutiny under the First Amendment, the court assumed without deciding that strict, rather than intermediate, scrutiny applied. Id., at 948–952. The court held that the Act satisfied that standard, finding that the Government's national security justifications—countering China's data collection and covert content manipulation efforts—were compelling, and that the Act was narrowly tailored to further those interests. Id., at 952–965.
Chief Judge Srinivasan concurred in part and in the judgment. Id., at 970. In his view, the Act was subject to intermediate scrutiny, id., at 974–979, and was constitutional under that standard, id., at 979–983.
We granted certiorari to decide whether the Act, as applied to petitioners, violates the First Amendment. 604 U. S. ___ (2024).
II
A
At the threshold, we consider whether the challenged provisions are subject to First Amendment scrutiny. Laws that directly regulate expressive conduct can, but do not necessarily, trigger such review. See R. A. V. v. St. Paul, 505 U.S. 377, 382–386 (1992). We have also applied First Amendment scrutiny in "cases involving governmental regulation of conduct that has an expressive element," and to "some statutes which, although directed at activity with no expressive component, impose a disproportionate burden upon those engaged in protected First Amendment activities." Arcara v. Cloud Books, Inc., 478 U.S. 697, 703–704 (1986).
It is not clear that the Act itself directly regulates protected expressive activity, or conduct with an expressive component. Indeed, the Act does not regulate the creator petitioners at all. And it directly regulates ByteDance Ltd. and TikTok Inc. only through the divestiture requirement. See §2(c)(1). Petitioners, for their part, have not identified any case in which this Court has treated a regulation of corporate control as a direct regulation of expressive activity or semi-expressive conduct. See Tr. of Oral Arg. 37–40. We hesitate to break that new ground in this unique case.
In any event, petitioners' arguments more closely approximate a claim that the Act's prohibitions, TikTok-specific designation, and divestiture requirement "impose a disproportionate burden upon" their First Amendment activities. Arcara, 478 U. S., at 704. Petitioners assert—and the Government does not contest—that, because it is commercially infeasible for TikTok to be divested within the Act's 270-day timeframe, the Act effectively bans TikTok in the United States. Petitioners argue that such a ban will burden various First Amendment activities, including content moderation, content generation, access to a distinct medium for expression, association with another speaker or preferred editor, and receipt of information and ideas.
We have recognized a number of these asserted First Amendment interests. See Moody v. NetChoice, LLC, 603 U.S. 707, 731 (2024) ("An entity 'exercising editorial discretion in the selection and presentation' of content is 'engaged in speech activity.' " (quoting Arkansas Ed. Television Comm'n v. Forbes, 523 U.S. 666, 674 (1998); alteration omitted)); City of Ladue v. Gilleo, 512 U.S. 43, 54–58 (1994) ("Our prior decisions have voiced particular concern with laws that foreclose an entire medium of expression."); Rumsfeld v. Forum for Academic and Institutional Rights, Inc., 547 U.S. 47, 68 (2006) ("We have recognized a First Amendment right to associate for the purpose of speaking, which we have termed a 'right of expressive association.' "); Martin v. City of Struthers, 319 U.S. 141, 143 (1943) ("The right of freedom of speech and press . . . embraces the right to distribute literature and necessarily protects the right to receive it." (citation omitted)).[2] And an effective ban on a social media platform with 170 million U. S. users certainly burdens those users' expressive activity in a non-trivial way.
At the same time, a law targeting a foreign adversary's control over a communications platform is in many ways different in kind from the regulations of non-expressive activity that we have subjected to First Amendment scrutiny. Those differences—the Act's focus on a foreign government, the congressionally determined adversary relationship between that foreign government and the United States, and the causal steps between the regulations and the alleged burden on protected speech—may impact whether First Amendment scrutiny applies.
This Court has not articulated a clear framework for determining whether a regulation of non-expressive activity that disproportionately burdens those engaged in expressive activity triggers heightened review. We need not do so here. We assume without deciding that the challenged provisions fall within this category and are subject to First Amendment scrutiny.
B
1
"At the heart of the First Amendment lies the principle that each person should decide for himself or herself the ideas and beliefs deserving of expression, consideration, and adherence." Turner Broadcasting System, Inc. v. FCC, 512 U.S. 622, 641 (1994) (Turner I). Government action that suppresses speech because of its message "contravenes this essential right." Ibid. "Content-based laws—those that target speech based on its communicative content—are presumptively unconstitutional and may be justified only if the government proves that they are narrowly tailored to serve compelling state interests." Reed v. Town of Gilbert, 576 U.S. 155, 163 (2015). Content-neutral laws, in contrast, "are subject to an intermediate level of scrutiny because in most cases they pose a less substantial risk of excising certain ideas or viewpoints from the public dialogue." Turner I, 512 U. S., at 642 (citation omitted). Under that standard, we will sustain a content-neutral law "if it advances important governmental interests unrelated to the suppression of free speech and does not burden substantially more speech than necessary to further those interests." Turner Broadcasting System, Inc. v. FCC, 520 U.S. 180, 189 (1997) (Turner II).
We have identified two forms of content-based speech regulation. First, a law is content based on its face if it "applies to particular speech because of the topic discussed or the idea or message expressed." Reed, 576 U. S., at 163; see id., at 163–164 (explaining that some facial distinctions define regulated speech by subject matter, others by the speech's function or purpose). Second, a facially content-neutral law is nonetheless treated as a content-based regulation of speech if it "cannot be 'justified without reference to the content of the regulated speech' " or was "adopted by the government 'because of disagreement with the message the speech conveys.' " Id., at 164 (quoting Ward v. Rock Against Racism, 491 U.S. 781, 791 (1989)).
As applied to petitioners, the challenged provisions are facially content neutral and are justified by a content-neutral rationale.
a
The challenged provisions are facially content neutral. They impose TikTok-specific prohibitions due to a foreign adversary's control over the platform and make divestiture a prerequisite for the platform's continued operation in the United States. They do not target particular speech based upon its content, contrast, e.g., Carey v. Brown, 447 U.S. 455, 465 (1980) (statute prohibiting all residential picketing except "peaceful labor picketing"), or regulate speech based on its function or purpose, contrast, e.g., Holder v. Humanitarian Law Project, 561 U.S. 1, 7, 27 (2010) (law prohibiting providing material support to terrorists). Nor do they impose a "restriction, penalty, or burden" by reason of content on TikTok—a conclusion confirmed by the fact that petitioners "cannot avoid or mitigate" the effects of the Act by altering their speech. Turner I, 512 U. S., at 644. As to petitioners, the Act thus does not facially regulate "particular speech because of the topic discussed or the idea or message expressed." Reed, 576 U. S., at 163.
Petitioners argue that the Act is content based on its face because it excludes from the definition of "covered company" any company that operates an application "whose primary purpose is to allow users to post product reviews, business reviews, or travel information and reviews." §2(g)(2)(B); see Brief for Petitioners in No. 24–656, pp. 26–27 (Brief for TikTok); Brief for Petitioners in No. 24–657, p. 26 (Brief for Creator Petitioners). We need not decide whether that exclusion is content based. The question before the Court is whether the Act violates the First Amendment as applied to petitioners. To answer that question, we look to the provisions of the Act that give rise to the effective TikTok ban that petitioners argue burdens their First Amendment rights. The exclusion for certain review platforms, however, applies only to the general framework for designating applications controlled by "covered compan[ies]," not to the TikTok-specific designation. §§2(g)(3)(A)–(B). As such, the exclusion is not within the scope of petitioners' as-applied challenge.
b
The Government also supports the challenged provisions with a content-neutral justification: preventing China from collecting vast amounts of sensitive data from 170 million U. S. TikTok users. 2 App. 628. That rationale is decidedly content agnostic. It neither references the content of speech on TikTok nor reflects disagreement with the message such speech conveys. Cf. Ward, 491 U. S., at 792–793 (holding noise control and sound quality justifications behind city sound amplification guideline were content neutral).
Because the data collection justification reflects a "purpos[e] unrelated to the content of expression," it is content neutral. Id., at 791.
2
The Act's TikTok-specific distinctions, moreover, do not trigger strict scrutiny. See Brief for TikTok 26–27; Brief for Creator Petitioners 24–26. It is true that "[s]peech restrictions based on the identity of the speaker are all too often simply a means to control content." Citizens United v. Federal Election Comm'n, 558 U.S. 310, 340 (2010). For that reason, "[r]egulations that discriminate among media, or among different speakers within a single medium, often present serious First Amendment concerns." Turner I, 512 U. S., at 659. But while "laws favoring some speakers over others demand strict scrutiny when the legislature's speaker preference reflects a content preference," id., at 658, such scrutiny "is unwarranted when the differential treatment is 'justified by some special characteristic of' the particular [speaker] being regulated," id., at 660–661 (quoting Minneapolis Star & Tribune Co. v. Minnesota Comm'r of Revenue, 460 U.S. 575, 585 (1983)).
For the reasons we have explained, requiring divestiture for the purpose of preventing a foreign adversary from accessing the sensitive data of 170 million U. S. TikTok users is not "a subtle means of exercising a content preference." Turner I, 512 U. S., at 645. The prohibitions, TikTok-specific designation, and divestiture requirement regulate TikTok based on a content-neutral data collection interest. And TikTok has special characteristics—a foreign adversary's ability to leverage its control over the platform to collect vast amounts of personal data from 170 million U. S. users—that justify this differential treatment. "[S]peaker distinctions of this nature are not presumed invalid under the First Amendment." Ibid.
While we find that differential treatment was justified here, however, we emphasize the inherent narrowness of our holding. Data collection and analysis is a common practice in this digital age. But TikTok's scale and susceptibility to foreign adversary control, together with the vast swaths of sensitive data the platform collects, justify differential treatment to address the Government's national security concerns. A law targeting any other speaker would by necessity entail a distinct inquiry and separate considerations.
On this understanding, we cannot accept petitioners' call for strict scrutiny. No more than intermediate scrutiny is in order.
C
As applied to petitioners, the Act satisfies intermediate scrutiny. The challenged provisions further an important Government interest unrelated to the suppression of free expression and do not burden substantially more speech than necessary to further that interest.[3]
1
The Act's prohibitions and divestiture requirement are designed to prevent China—a designated foreign adversary—from leveraging its control over ByteDance Ltd. to capture the personal data of U. S. TikTok users. This objective qualifies as an important Government interest under intermediate scrutiny.
Petitioners do not dispute that the Government has an important and well-grounded interest in preventing China from collecting the personal data of tens of millions of U. S. TikTok users. Nor could they. The platform collects extensive personal information from and about its users. See H. R. Rep., at 3 (Public reporting has suggested that TikTok's "data collection practices extend to age, phone number, precise location, internet address, device used, phone contacts, social network connections, the content of private messages sent through the application, and videos watched."); 1 App. 241 (Draft National Security Agreement noting that TikTok collects user data, user content, behavioral data (including "keystroke patterns and rhythms"), and device and network data (including device contacts and calendars)). If, for example, a user allows TikTok access to the user's phone contact list to connect with others on the platform, TikTok can access "any data stored in the user's contact list," including names, contact information, contact photos, job titles, and notes. 2 id., at 659. Access to such detailed information about U. S. users, the Government worries, may enable "China to track the locations of Federal employees and contractors, build dossiers of personal information for blackmail, and conduct corporate espionage." 3 CFR 412. And Chinese law enables China to require companies to surrender data to the government, "making companies headquartered there an espionage tool" of China. H. R. Rep., at 4.
Rather than meaningfully dispute the scope of the data TikTok collects or the ends to which it may be used, petitioners contest probability, asserting that it is "unlikely" that China would "compel TikTok to turn over user data for intelligence-gathering purposes, since China has more effective and efficient means of obtaining relevant information." Brief for TikTok 50 (internal quotation marks omitted). In reviewing the constitutionality of the Act, however, we "must accord substantial deference to the predictive judgments of Congress." Turner I, 512 U. S., at 665 (opinion of Kennedy, J.). "Sound policymaking often requires legislators to forecast future events and to anticipate the likely impact of these events based on deductions and inferences for which complete empirical support may be unavailable." Ibid. Here, the Government's TikTok-related data collection concerns do not exist in isolation. The record reflects that China "has engaged in extensive and years-long efforts to accumulate structured datasets, in particular on U. S. persons, to support its intelligence and counterintelligence operations." 2 App. 634.
Even if China has not yet leveraged its relationship with ByteDance Ltd. to access U. S. TikTok users' data, petitioners offer no basis for concluding that the Government's determination that China might do so is not at least a "reasonable inferenc[e] based on substantial evidence." Turner II, 520 U. S., at 195. We are mindful that this law arises in a context in which "national security and foreign policy concerns arise in connection with efforts to confront evolving threats in an area where information can be difficult to obtain and the impact of certain conduct difficult to assess." Humanitarian Law Project, 561 U. S., at 34. We thus afford the Government's "informed judgment" substantial respect here. Ibid.
Petitioners further argue that the Act is underinclusive as to the Government's data protection concern, raising doubts as to whether the Government is actually pursuing that interest. In particular, petitioners argue that the Act's focus on applications with user-generated and user-shared content, along with its exclusion for certain review platforms, exempts from regulation applications that are "as capable as TikTok of collecting Americans' data." Brief for TikTok 43; see Brief for Creator Petitioners 48–49. But "the First Amendment imposes no freestanding underinclusiveness limitation," and the Government "need not address all aspects of a problem in one fell swoop." Williams-Yulee v. Florida Bar, 575 U.S. 433, 449 (2015) (internal quotation marks omitted). Furthermore, as we have already concluded, the Government had good reason to single out TikTok for special treatment. Contrast Brown v. Entertainment Merchants Assn., 564 U.S. 786, 802 (2011) (singling out purveyors of video games for disfavored treatment without a persuasive reason "raise[d] serious doubts about whether the government [wa]s in fact pursuing the interest it invoke[d], rather than disfavoring a particular speaker or viewpoint"). On this record, Congress was justified in specifically addressing its TikTok-related national security concerns.
2
As applied to petitioners, the Act is sufficiently tailored to address the Government's interest in preventing a foreign adversary from collecting vast swaths of sensitive data about the 170 million U. S. persons who use TikTok. To survive intermediate scrutiny, "a regulation need not be the least speech-restrictive means of advancing the Government's interests." Turner I, 512 U. S., at 662. Rather, the standard "is satisfied 'so long as the regulation promotes a substantial government interest that would be achieved less effectively absent the regulation' " and does not "burden substantially more speech than is necessary" to further that interest. Ward, 491 U. S., at 799 (quoting United States v. Albertini, 472 U.S. 675, 689 (1985); alteration omitted).
The challenged provisions meet this standard. The provisions clearly serve the Government's data collection interest "in a direct and effective way." Ward, 491 U. S., at 800. The prohibitions account for the fact that, absent a qualified divestiture, TikTok's very operation in the United States implicates the Government's data collection concerns, while the requirements that make a divestiture "qualified" ensure that those concerns are addressed before TikTok resumes U. S. operations. Neither the prohibitions nor the divestiture requirement, moreover, is "substantially broader than necessary to achieve" this national security objective. Ibid. Rather than ban TikTok outright, the Act imposes a conditional ban. The prohibitions prevent China from gathering data from U. S. TikTok users unless and until a qualified divestiture severs China's control.
Petitioners parade a series of alternatives—disclosure requirements, data sharing restrictions, the proposed national security agreement, the general designation provision—that they assert would address the Government's data collection interest in equal measure to a conditional TikTok ban. Those alternatives do not alter our tailoring analysis.
Petitioners' proposed alternatives ignore the "latitude" we afford the Government to design regulatory solutions to address content-neutral interests. Turner II, 520 U. S., at 213. "So long as the means chosen are not substantially broader than necessary to achieve the government's interest, . . . the regulation will not be invalid simply because a court concludes that the government's interest could be adequately served by some less-speech-restrictive alternative." Ward, 491 U. S., at 800; see ibid. (regulation valid despite availability of less restrictive "alternative regulatory methods"); Albertini, 472 U. S., at 689; Clark v. Community for Creative Non-Violence, 468 U.S. 288, 299 (1984); Members of City Council of Los Angeles v. Taxpayers for Vincent, 466 U.S. 789, 815–816 (1984). For the reasons we have explained, the challenged provisions are "not substantially broader than necessary" to address the Government's data collection concerns. Ward, 491 U. S., at 800. Nor did the Government ignore less restrictive approaches already proven effective. Contrast McCullen v. Coakley, 573 U.S. 464, 490–494 (2014) (state law burdened substantially more speech than necessary where State had not considered less restrictive measures successfully adopted by other jurisdictions). The validity of the challenged provisions does not turn on whether we agree with the Government's conclusion that its chosen regulatory path is best or "most appropriate." Albertini, 472 U. S., at 689. "We cannot displace [the Government's] judgment respecting content-neutral regulations with our own, so long as its policy is grounded on reasonable factual findings supported by evidence that is substantial for a legislative determination." Turner II, 520 U. S., at 224. Those requirements are met here.
D
In addition to the data collection concerns addressed above, the Government asserts an interest in preventing a foreign adversary from having control over the recommendation algorithm that runs a widely used U. S. communications platform, and from being able to wield that control to alter the content on the platform in an undetectable manner. See 2 App. 628. In petitioners' view, that rationale is a content-based justification that "taint[s]" the Government's data collection interest and triggers strict scrutiny. Brief for TikTok 41.
Petitioners have not pointed to any case in which this Court has assessed the appropriate level of First Amendment scrutiny for an Act of Congress justified on both content-neutral and content-based grounds. They assert, however, that the challenged provisions are subject to—and fail—strict scrutiny because Congress would not have passed the provisions absent the foreign adversary control rationale. See Brief for TikTok 41–42; Brief for Creator Petitioners 47–50. We need not determine the proper standard for mixed-justification cases or decide whether the Government's foreign adversary control justification is content neutral. Even assuming that rationale turns on content, petitioners' argument fails under the counterfactual analysis they propose: The record before us adequately supports the conclusion that Congress would have passed the challenged provisions based on the data collection justification alone.
To start, the House Report focuses overwhelmingly on the Government's data collection concerns, noting the "breadth" of TikTok's data collection, "the difficulty in assessing precisely which categories of data" the platform collects, the "tight interlinkages" between TikTok and the Chinese Government, and the Chinese Government's ability to "coerc[e]" companies in China to "provid[e] data." H. R. Rep., at 3; see id., at 5–12 (recounting a five-year record of Government actions raising and attempting to address those very concerns). Indeed, it does not appear that any legislator disputed the national security risks associated with TikTok's data collection practices, and nothing in the legislative record suggests that data collection was anything but an overriding congressional concern. We are especially wary of parsing Congress's motives on this record with regard to an Act passed with striking bipartisan support. See 170 Cong. Rec. H1170 (Mar. 13, 2024) (352–65); 170 Cong. Rec. S2992 (Apr. 23, 2024) (79–18).
Petitioners assert that the text of the Act itself undermines this conclusion. In particular, they argue that the Government's data collection rationale cannot justify the requirement that a qualified divestiture preclude "any operational relationship" that allows for "cooperation with respect to the operation of a content recommendation algorithm or an agreement with respect to data sharing." §2(g)(6)(B); see Brief for Creator Petitioners 48–49. We disagree. The Government has explained that ByteDance Ltd. uses the data it collects to train the TikTok recommendation algorithm, which is developed and maintained in China. According to the Government, ByteDance Ltd. has previously declined to agree to stop collecting U. S. user data or sending that data to China to train the algorithm. See 2 App. 705–706. The Government has further noted the difficulties associated with monitoring data sharing between ByteDance Ltd. and TikTok Inc. See id., at 692–697. Under these circumstances, we find the Government's data collection justification sufficient to sustain the challenged provisions.
* * *
There is no doubt that, for more than 170 million Americans, TikTok offers a distinctive and expansive outlet for expression, means of engagement, and source of community. But Congress has determined that divestiture is necessary to address its well-supported national security concerns regarding TikTok's data collection practices and relationship with a foreign adversary. For the foregoing reasons, we conclude that the challenged provisions do not violate petitioners' First Amendment rights.
The judgment of the United States Court of Appeals for the District of Columbia Circuit is affirmed.
It is so ordered.
Notes
[1] Applications for an injunction pending review were filed on December 16, 2024; we construed the applications as petitions for a writ of certiorari and granted them on December 18, 2024; and oral argument was held on January 10, 2025.
[2] To the extent that ByteDance Ltd.'s asserted expressive activity occurs abroad, that activity is not protected by the First Amendment. See Agency for Int'l Development v. Alliance for Open Society Int'l Inc., 591 U.S. 430, 436 (2020) ("[F]oreign organizations operating abroad have no First Amendment rights.").
[3] Our holding and analysis are based on the public record, without reference to the classified evidence the Government filed below.
SUPREME COURT OF THE UNITED STATES
_________________
Nos. 24–656 and 24–657
_________________
TIKTOK INC., et al., PETITIONERS (24–656)
v.
MERRICK B. GARLAND, ATTORNEY GENERAL
BRIAN FIREBAUGH, et al., PETITIONERS (24–657)
v.
MERRICK B. GARLAND, ATTORNEY GENERAL
ON APPLICATIONS FOR INJUNCTION PENDING REVIEW TO THE UNITED STATES COURT OF APPEALS FOR THE DISTRICT OF COLUMBIA CIRCUIT.
[January 17, 2025]
Justice Sotomayor, concurring in part and concurring in the judgment.
I join all but Part II.A of the Court's per curiam opinion. I see no reason to assume without deciding that the Act implicates the First Amendment, because our precedent leaves no doubt that it does.
TikTok engages in expressive activity by "compiling and curating" material on its platform. Moody v. NetChoice, LLC, 603 U.S. 707, 731 (2024). Laws that "impose a disproportionate burden" upon those engaged in expressive activity are subject to heightened scrutiny under the First Amendment. Arcara v. Cloud Books, Inc., 478 U.S. 697, 704 (1986); see Minneapolis Star & Tribune Co. v. Minnesota Comm'r of Revenue, 460 U.S. 575, 581–585 (1983). The challenged Act plainly imposes such a burden: It bars any entity from distributing TikTok's speech in the United States, unless TikTok undergoes a qualified divestiture. The Act, moreover, effectively prohibits TikTok from collaborating with certain entities regarding its "content recommendation algorithm" even following a qualified divestiture. §2(g)(6)(B), 138 Stat. 959. And the Act implicates content creators' "right to associate" with their preferred publisher "for the purpose of speaking." Rumsfeld v. Forum for Academic and Institutional Rights, Inc., 547 U.S. 47, 68 (2006). That, too, calls for First Amendment scrutiny.
As to the remainder of the per curiam opinion, I agree that the Act survives petitioners' First Amendment challenge.
SUPREME COURT OF THE UNITED STATES
_________________
Nos. 24–656 and 24–657
_________________
TIKTOK INC., et al., PETITIONERS
24–656 v.
MERRICK B. GARLAND, ATTORNEY GENERAL
BRIAN FIREBAUGH, et al., PETITIONERS
24–657 v.
MERRICK B. GARLAND, ATTORNEY GENERAL
ON APPLICATIONS FOR INJUNCTION PENDING REVIEW TO THE UNITED STATES COURT OF APPEALS FOR THE DISTRICT OF COLUMBIA CIRCUIT.
[January 17, 2025]
Justice Gorsuch, concurring in judgment.
We have had a fortnight to resolve, finally and on the merits, a major First Amendment dispute affecting more than 170 million Americans. Briefing finished on January 3, argument took place on January 10, and our opinions issue on January 17, 2025. Given those conditions, I can sketch out only a few, and admittedly tentative, observations.
First, the Court rightly refrains from endorsing the government's asserted interest in preventing "the covert manipulation of content" as a justification for the law before us. Brief for Respondent 37. One man's "covert content manipulation" is another's "editorial discretion." Journalists, publishers, and speakers of all kinds routinely make less-than-transparent judgments about what stories to tell and how to tell them. Without question, the First Amendment has much to say about the right to make those choices. It makes no difference that Americans (like TikTok Inc. and many of its users) may wish to make decisions about what they say in concert with a foreign adversary. "Those who won our independence" knew the vital importance of the "freedom to think as you will and to speak as you think," as well as the dangers that come with repressing the free flow of ideas. Whitney v. California, 274 U.S. 357, 375 (1927) (Brandeis, J., concurring). They knew, too, that except in the most extreme situations, "the fitting remedy for evil counsels is good ones." Ibid. Too often in recent years, the government has sought to censor disfavored speech online, as if the internet were somehow exempt from the full sweep of the First Amendment. See, e.g., Murthy v. Missouri, 603 U.S. 43, 76–78 (2024) (Alito, J., dissenting). But even as times and technologies change, "the principle of the right to free speech is always the same." Abrams v. United States, 250 U.S. 616, 628 (1919) (Holmes, J., dissenting).
Second, I am pleased that the Court declines to consider the classified evidence the government has submitted to us but shielded from petitioners and their counsel. Ante, at 13, n. 3. Efforts to inject secret evidence into judicial proceedings present obvious constitutional concerns. Usually, "the evidence used to prove the Government's case must be disclosed to the individual so that he has an opportunity to show that it is untrue." Greene v. McElroy, 360 U.S. 474, 496 (1959). Maybe there is a way to handle classified evidence that would afford a similar opportunity in cases like these. Maybe, too, Congress or even the Standing Committee on Rules of Practice and Procedure would profit from considering the question. Cf. United States v. Zubaydah, 595 U.S. 195, 245 (2022) (Gorsuch, J., dissenting). But as the Court recognizes, we have no business considering the government's secret evidence here.
Third, I harbor serious reservations about whether the law before us is "content neutral" and thus escapes "strict scrutiny." See ante, at 9–12; Brief for Petitioners in No. 24–656, pp. 25–31; Brief for Petitioners in No. 24–657, pp. 24–26; Reply Brief in No. 24–656, pp. 10–12; Reply Brief in No. 24–657, pp. 8–11. More than that, while I do not doubt that the various "tiers of scrutiny" discussed in our case law—"rational basis, strict scrutiny, something(s) in between"—can help focus our analysis, I worry that litigation over them can sometimes take on a life of its own and do more to obscure than to clarify the ultimate constitutional questions. Riddle v. Hickenlooper, 742 F.3d 922, 932 (CA10 2014) (Gorsuch, J., concurring).
Fourth, whatever the appropriate tier of scrutiny, I am persuaded that the law before us seeks to serve a compelling interest: preventing a foreign country, designated by Congress and the President as an adversary of our Nation, from harvesting vast troves of personal information about tens of millions of Americans. The record before us establishes that TikTok mines data both from TikTok users and about millions of others who do not consent to share their information. 2 App. 659. According to the Federal Bureau of Investigation, TikTok can access "any data" stored in a consenting user's "contact list"—including names, photos, and other personal information about unconsenting third parties. Ibid. (emphasis added). And because the record shows that the People's Republic of China (PRC) can require TikTok's parent company "to cooperate with [its] efforts to obtain personal data," there is little to stop all that information from ending up in the hands of a designated foreign adversary. Id., at 696; see id., at 673–676; ante, at 3. The PRC may then use that information to "build dossiers . . . for blackmail," "conduct corporate espionage," or advance intelligence operations. 1 App. 215; see 2 App. 659. To be sure, assessing exactly what a foreign adversary may do in the future implicates "delicate" and "complex" judgments about foreign affairs and requires "large elements of prophecy." Chicago & Southern Air Lines, Inc. v. Waterman S. S. Corp., 333 U.S. 103, 111 (1948) (Jackson, J., for the Court). But the record the government has amassed in these cases after years of study supplies compelling reason for concern.
Finally, the law before us also appears appropriately tailored to the problem it seeks to address. Without doubt, the remedy Congress and the President chose here is dramatic. The law may require TikTok's parent company to divest or (effectively) shutter its U. S. operations. But before seeking to impose that remedy, the coordinate branches spent years in negotiations with TikTok exploring alternatives and ultimately found them wanting. Ante, at 4. And from what I can glean from the record, that judgment was well founded.
Consider some of the alternatives. Start with our usual and preferred remedy under the First Amendment: more speech. Supra, at 2. However helpful that might be, the record shows that warning users of the risks associated with giving their data to a foreign-adversary-controlled application would do nothing to protect nonusers' data. 2 App. 659–660; supra, at 3. Forbidding TikTok's domestic operations from sending sensitive data abroad might seem another option. But even if Congress were to impose serious criminal penalties on domestic TikTok employees who violate a data-sharing ban, the record suggests that would do little to deter the PRC from exploiting TikTok to steal Americans' data. See 1 App. 214 (noting threats from "malicious code, backdoor vulnerabilities, surreptitious surveillance, and other problematic activities tied to source code development" in the PRC); 2 App. 702 ("[A]gents of the PRC would not fear monetary or criminal penalties in the United States"). The record also indicates that the "size" and "complexity" of TikTok's "underlying software" may make it impossible for law enforcement to detect violations. Id., at 688–689; see also id., at 662. Even setting all these challenges aside, any new compliance regime could raise separate constitutional concerns—for instance, by requiring the government to surveil Americans' data to ensure that it isn't illicitly flowing overseas. Id., at 687 (suggesting that effective enforcement of a data-export ban might involve "direct U. S. government monitoring" of the "flow of U. S. user data").
Whether this law will succeed in achieving its ends, I do not know. A determined foreign adversary may just seek to replace one lost surveillance application with another. As time passes and threats evolve, less dramatic and more effective solutions may emerge. Even what might happen next to TikTok remains unclear. See Tr. of Oral Arg. 146–147. But the question we face today is not the law's wisdom, only its constitutionality. Given just a handful of days after oral argument to issue an opinion, I cannot profess the kind of certainty I would like to have about the arguments and record before us. All I can say is that, at this time and under these constraints, the problem appears real and the response to it not unconstitutional. As persuaded as I am of the wisdom of Justice Brandeis in Whitney and Justice Holmes in Abrams, their cases are not ours. See supra, at 2. Speaking with and in favor of a foreign adversary is one thing. Allowing a foreign adversary to spy on Americans is another.