YouTube
Logo since 2017: a red rounded-rectangular box with a white "play" button beside the word "YouTube" in black
Screenshot of YouTube's front page on August 29, 2017

Type of business: Subsidiary
Type of site: Online video platform
Founded: February 14, 2005
Headquarters: 901 Cherry Avenue, San Bruno, California, United States
Area served: Worldwide (excluding blocked countries)
Owner: Alphabet Inc.
Founder(s): Steve Chen, Chad Hurley, Jawed Karim
Key people: Susan Wojcicki (CEO), Chad Hurley (advisor)
Products: YouTube Premium, YouTube Music, YouTube TV, YouTube Kids
Revenue: US$19.8 billion (2020)[1]
Parent: Google LLC (2006–present)
Website: YouTube.com (see list of localized domain names)
Advertising: Google AdSense
Users: 2 billion (October 2020)[2]
Launched: February 14, 2005
Current status: Active
Content license: uploader holds copyright (standard license); Creative Commons can be selected
Written in: Python (core/API),[3] C (through CPython), C++, Java (through Guice platform),[4][5] Go,[6] JavaScript (UI)

YouTube is an online video platform owned by Google. In total, users watch more than one billion hours of YouTube videos each day,[7] and hundreds of hours of video content are uploaded to YouTube servers every minute.[8] It was founded by Steve Chen, Chad Hurley, and Jawed Karim.

YouTube provides several ways to watch videos, including the website, the mobile apps, and permitting other websites to embed them. Available content includes music videos, video clips, short and documentary films, audio recordings, movie trailers, live streams, and video blogs. Most content is generated by individuals, but media corporations also publish videos. Besides watching and uploading, registered users can comment on videos, rate them, create playlists, and subscribe to other users.

Founded in 2005, YouTube was acquired the following year by Google for US$1.65 billion. It has become one of the company's most lucrative subsidiaries, earning $19.8 billion in 2020.[1] YouTube and selected creators earn advertising revenue from Google's AdSense program. The vast majority of videos are free to view, but some require a music or premium subscription.

Given the popularity of YouTube and its abundance of video content, the platform has made a significant social impact throughout the world. There have also been numerous controversies regarding the business, moral, and political aspects of YouTube.

History

Founding and initial growth (2005–2006)

From left to right: Chad Hurley, Steve Chen and Jawed Karim, the founders of YouTube

YouTube was founded by Steve Chen, Chad Hurley, and Jawed Karim. The trio were all early employees of PayPal, which they left, newly wealthy, after the company was bought by eBay.[9] Hurley had studied design at Indiana University of Pennsylvania, while Chen and Karim studied computer science together at the University of Illinois at Urbana–Champaign.[10]

There are multiple stories told of the company's founding. According to a story that has often been repeated in the media, Hurley and Chen developed the idea for YouTube during the early months of 2005, after they had experienced difficulty sharing videos that had been shot at a dinner party at Chen's apartment in San Francisco. Karim did not attend the party and denied that it had occurred, but Chen commented that the idea that YouTube was founded after a dinner party "was probably very strengthened by marketing ideas around creating a story that was very digestible".[11] Karim said the inspiration for YouTube first came from Janet Jackson's role in the 2004 Super Bowl incident when her breast was exposed during her performance, and later from the 2004 Indian Ocean tsunami. Karim could not easily find video clips of either event online, which led to the idea of a video sharing site.[12] Hurley and Chen said that the original idea for YouTube was a video version of an online dating service, and had been influenced by the website Hot or Not.[11][13] They created posts on Craigslist asking attractive women to upload videos of themselves to YouTube in exchange for a $100 reward.[14] Difficulty in finding enough dating videos led to a change of plans, with the site's founders deciding to accept uploads of any type of video.[15]

This YouTube logo was used from the site's launch until 2011. A version without the "Broadcast Yourself" slogan was used until 2015.

YouTube began as a venture capital–funded technology startup. Between November 2005 and April 2006, the company raised money from a variety of investors, the two largest being Sequoia Capital, with $11.5 million, and Artis Capital Management, with $8 million.[9][16] YouTube's early headquarters were situated above a pizzeria and Japanese restaurant in San Mateo, California.[17] In February 2005, the company activated www.youtube.com.[18] The first video was uploaded on April 23, 2005. Titled Me at the zoo, it shows co-founder Jawed Karim at the San Diego Zoo and can still be viewed on the site.[19][20] In May the company launched a public beta, and by November a Nike ad featuring Ronaldinho had become the first video to reach one million total views.[21][22] The site launched officially on December 15, 2005, by which time it was receiving 8 million views a day.[23][24] Clips at the time were limited to 100 megabytes, as little as 30 seconds of footage.[25]

YouTube was not the first video-sharing site on the Internet; Vimeo had launched in November 2004, though that site remained a side project of its developers from CollegeHumor and had not grown much by then.[26] The week of YouTube's launch, NBCUniversal's Saturday Night Live ran the skit "Lazy Sunday" by The Lonely Island. Besides helping to bolster ratings and long-term viewership for Saturday Night Live, "Lazy Sunday"'s status as an early viral video helped establish YouTube as an important website.[27] Unofficial uploads of the skit to YouTube drew in more than five million collective views by February 2006, before being removed two months later at NBCUniversal's request over copyright concerns.[28] Despite eventually being taken down, these duplicate uploads of the skit helped popularize YouTube and led to the upload of further third-party content.[29][30] The site grew rapidly and, in July 2006, the company announced that more than 65,000 new videos were being uploaded every day and that the site was receiving 100 million video views per day.[31]

The choice of the name www.youtube.com led to problems for a similarly named website, www.utube.com. That site's owner, Universal Tube & Rollform Equipment, filed a lawsuit against YouTube in November 2006 after being regularly overloaded by people looking for YouTube. Universal Tube subsequently changed its website to www.utubeonline.com.[32][33]

Acquisition by Google (2006–2013)

YouTube's headquarters in San Bruno, California

On October 9, 2006, Google announced that it had acquired YouTube for $1.65 billion in Google stock.[34][35] The deal was finalized on November 13, 2006.[36][37] Google's acquisition sparked newfound interest in video-sharing sites; IAC, which now owned Vimeo, focused on supporting content creators to distinguish itself from YouTube.[26]

YouTube logo from 2015 until 2017

The company experienced rapid growth. The Daily Telegraph wrote that in 2007, YouTube consumed as much bandwidth as the entire Internet in 2000.[38] By 2010, the company had reached a market share of around 43% and more than 14 billion video views, according to comScore.[39] That year the company also redesigned its interface with the aim of simplifying it and increasing the time users spend on the site.[40] In 2011, more than three billion videos were being watched each day, with 48 hours of new video uploaded every minute.[41][42][43] However, most of these views came from a relatively small number of videos; according to a software engineer at that time, 30% of videos accounted for 99% of views on the site.[44] That year the company again changed its interface and at the same time introduced a new logo with a darker shade of red.[45][46] A further interface change, designed to unify the experience across desktop, TV, and mobile, was rolled out in 2013.[47] By that point more than 100 hours of video were being uploaded every minute, a number that would increase to 300 hours by November 2014.[48][49]

During this time, the company also went through some organizational changes. In October 2006, YouTube moved to a new office in San Bruno, California.[50] In October 2010, Hurley announced that he would be stepping down as chief executive officer of YouTube to take an advisory role, and that Salar Kamangar would take over as head of the company.[51]

New revenue streams (2013–2018)

YouTube logo since 2017

Susan Wojcicki was appointed CEO of YouTube in February 2014.[52] In January 2016, YouTube expanded its headquarters in San Bruno by purchasing an office park for $215 million. The complex has 51,468 square metres (554,000 square feet) of space and can house up to 2,800 employees.[53] In August 2017, YouTube officially launched the "polymer" redesign of its user interface, based on the Material Design language, as its default, as well as a redesigned logo built around the service's play button emblem.[54]

Through this period, YouTube tried several new ways to generate revenue beyond advertisements. In 2013, YouTube launched a pilot program for content providers to offer premium, subscription-based channels within the platform.[55][56] This effort was discontinued in January 2018 and relaunched in June, with US$4.99 channel subscriptions.[57][58] These channel subscriptions complemented the existing Super Chat ability, launched in 2017, which allows viewers to donate between $1 and $500 to have their comment highlighted.[59] In 2014, YouTube announced a subscription service known as "Music Key," which bundled ad-free streaming of music content on YouTube with the existing Google Play Music service.[60] The service continued to evolve in 2015, when YouTube announced YouTube Red, a new premium service that would offer ad-free access to all content on the platform (succeeding the Music Key service released the previous year), premium original series, and films produced by YouTube personalities, as well as background playback of content on mobile devices. YouTube also released YouTube Music, a third app oriented towards streaming and discovering the music content hosted on the YouTube platform.[61][62][63]

The company also attempted to create products to appeal to specific kinds of viewers. YouTube released a mobile app known as YouTube Kids in 2015 designed to provide an experience optimized for children. It features a simplified user interface, curated selections of channels featuring age-appropriate content, and parental control features.[64] Also in 2015, YouTube launched YouTube Gaming—a video gaming-oriented vertical and app for videos and live streaming, intended to compete with the Amazon.com-owned Twitch.[65]

Consolidation and conflict (2018–present)

By February 2017, one billion hours of YouTube were being watched every day and 400 hours of video were uploaded every minute.[7][67] Two years later, uploads had increased to more than 500 hours per minute.[8] On April 3, 2018, a shooting took place at YouTube's headquarters in San Bruno, California, wounding three people before the shooter took her own life.[66] During the COVID-19 pandemic, when most of the world was under stay-at-home orders, usage of services such as YouTube grew greatly. One data firm estimated that YouTube was accounting for 15% of all internet traffic, twice its pre-pandemic level.[68] After EU officials requested that such services reduce bandwidth to ensure medical entities had sufficient capacity to share information, YouTube, along with Netflix, stated it would reduce streaming quality for at least thirty days, cutting its bandwidth use by 25% to comply with the EU's request.[69] YouTube later announced that it would continue this move worldwide: "We continue to work closely with governments and network operators around the globe to do our part to minimize stress on the system during this unprecedented situation".[70]

Following a 2018 complaint alleging violations of the Children's Online Privacy Protection Act (COPPA),[71] the company was fined $170 million by the FTC for collecting personal information from minors under the age of 13.[72] YouTube was also ordered to create systems to increase children's privacy.[73][74] Following criticism of its implementation of those systems, YouTube began treating all videos designated as "made for kids" as liable under COPPA on January 6, 2020.[75][76] In 2021, joining the YouTube Kids app, the company created a supervised mode designed more for tweens.[77]

During this period, YouTube also got into disputes with other tech companies. For over a year spanning 2018 and 2019, there was no YouTube app available for Amazon Fire products, and in 2020 Roku was forced to remove the app from its streaming store after the two companies could not come to an agreement.[78][79]

Videos

In January 2012, it was estimated that visitors to YouTube spent an average of 15 minutes a day on the site, in contrast to the four or five hours a day spent by a typical US citizen watching television.[80] In 2017, viewers on average watched YouTube on mobile devices for more than an hour every day.[81]

In December 2012, two billion views were removed from the view counts of Universal and Sony music videos on YouTube, prompting a claim by The Daily Dot that the views had been deleted due to a violation of the site's terms of service, which ban the use of automated processes to inflate view counts. This was disputed by Billboard, which said that the two billion views had been moved to Vevo, since the videos were no longer active on YouTube.[82][83] On August 5, 2015, YouTube removed the formerly notorious behaviour that froze a video's view count at "301" (later "301+") until the actual count was verified, a measure intended to prevent view count fraud.[84] YouTube view counts once again updated in real time.[85]

Since September 2019, subscriber counts have been abbreviated: only the three leading digits of a channel's subscriber count are shown publicly, compromising the function of third-party real-time indicators such as that of Social Blade. Exact counts remain available to channel operators inside YouTube Studio.[86] In March 2021 it was reported that YouTube was testing hiding dislike counts on videos for viewers. According to YouTube, creators would still be able to see the number of likes and dislikes in the YouTube Studio dashboard tool.[87][88]
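The "three leading digits" abbreviation can be illustrated with a short sketch. YouTube's exact rounding rules are not published, so the function below is a hypothetical approximation of rounding a count to three significant figures with a unit suffix:

```python
def abbreviate_subscribers(count: int) -> str:
    """Round a subscriber count to three significant digits with a unit
    suffix, approximating YouTube's public display since September 2019.
    Illustrative only; the real rounding rules are not documented."""
    for threshold, suffix in ((1_000_000_000, "B"), (1_000_000, "M"), (1_000, "K")):
        if count >= threshold:
            value = count / threshold
            # Keep three significant digits: 2.35M, 23.5M, 235M.
            if value >= 100:
                return f"{value:.0f}{suffix}"
            elif value >= 10:
                return f"{value:.1f}{suffix}"
            return f"{value:.2f}{suffix}"
    return str(count)
```

Under this sketch a channel with 2,354,591 subscribers would display as "2.35M", which is why third-party trackers can no longer recover exact counts from the public figure.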

Copyright

YouTube has faced numerous challenges and criticisms in its attempts to deal with copyright, starting with the site's first viral video, "Lazy Sunday", which had to be taken down due to copyright concerns.[27] At the time of uploading a video, YouTube users are shown a message asking them not to violate copyright laws.[89] Despite this advice, many unauthorized clips of copyrighted material remain on YouTube. YouTube does not view videos before they are posted online, and it is left to copyright holders to issue a DMCA takedown notice pursuant to the terms of the Online Copyright Infringement Liability Limitation Act. Any successful complaint about copyright infringement results in a YouTube copyright strike. Three successful complaints for copyright infringement against a user account result in the account and all of its uploaded videos being deleted.[90][91] From 2007 to 2009, organizations including Viacom, Mediaset, and the English Premier League filed lawsuits against YouTube, claiming that it had done too little to prevent the uploading of copyrighted material.[92][93][94]
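The three-strike rule described above amounts to simple state logic: each upheld complaint adds a strike, and the third terminates the account and removes its uploads. A minimal sketch, with hypothetical names rather than YouTube's actual code:

```python
from dataclasses import dataclass, field

@dataclass
class ChannelAccount:
    """Hypothetical model of the three-strike copyright rule."""
    strikes: int = 0
    active: bool = True
    videos: list = field(default_factory=list)

    def apply_copyright_strike(self) -> None:
        if not self.active:
            return
        self.strikes += 1
        if self.strikes >= 3:
            # Third strike: account terminated, all uploads deleted.
            self.active = False
            self.videos.clear()
```

In practice strikes can also expire or be resolved by counter-notice, which this sketch omits.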

In August 2008, a US court ruled in Lenz v. Universal Music Corp. that copyright holders cannot order the removal of an online file without first determining whether the posting reflected fair use of the material.[95] YouTube's owner Google announced in November 2015 that they would help cover the legal cost in select cases where they believe fair use defenses apply.[96]

In the 2011 case of Smith v. Summit Entertainment LLC, professional singer Matt Smith sued Summit Entertainment for the wrongful use of copyright takedown notices on YouTube.[97] He asserted seven causes of action, and four were ruled in Smith's favor.[98] In April 2012, a court in Hamburg ruled that YouTube could be held responsible for copyrighted material posted by its users.[99] On November 1, 2016, the dispute with GEMA was resolved, with Google content ID being used to allow advertisements to be added to videos with content protected by GEMA.[100]

In April 2013, it was reported that Universal Music Group and YouTube had a contractual agreement preventing content blocked on YouTube by a request from UMG from being restored, even if the uploader of the video files a DMCA counter-notice.[101][102] As part of YouTube Music, Universal and YouTube signed an agreement in 2017, followed by separate agreements with other major labels, which gave the company the right to advertising revenue when its music was played on YouTube.[103] By 2019, creators were having videos taken down or demonetized when Content ID identified even short segments of copyrighted music within a much longer video, with different levels of enforcement depending on the record label.[104] Experts noted that some of these clips would have qualified for fair use.[104]

Content ID

In June 2007, YouTube began trials of a system for automatic detection of uploaded videos that infringe copyright. Google CEO Eric Schmidt regarded this system as necessary for resolving lawsuits such as the one from Viacom, which alleged that YouTube profited from content that it did not have the right to distribute.[105] The system, which was initially called "Video Identification"[106][107] and later became known as Content ID,[108] creates an ID File for copyrighted audio and video material and stores it in a database. When a video is uploaded, it is checked against the database, and the video is flagged as a copyright violation if a match is found.[109] When this occurs, the content owner has the choice of blocking the video to make it unviewable, tracking the video's viewing statistics, or adding advertisements to it.
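The flow described above (fingerprint reference material, match uploads against the database, then apply the owner's chosen policy) can be sketched as a toy pipeline. Real Content ID uses perceptual audio/video fingerprints robust to re-encoding; the byte-window hashing and all names below are purely illustrative:

```python
import hashlib

def fingerprint(media_bytes: bytes, window: int = 16) -> set:
    """Hash fixed-size windows of the media as a crude stand-in for
    the perceptual fingerprints a real system would compute."""
    return {
        hashlib.sha1(media_bytes[i:i + window]).hexdigest()
        for i in range(0, len(media_bytes) - window + 1, window)
    }

# Hypothetical reference database: fingerprint hash -> (owner, policy).
reference_db = {}

def register_reference(media: bytes, owner: str, policy: str) -> None:
    """Store an ID File for the rights holder's reference material."""
    for h in fingerprint(media):
        reference_db[h] = (owner, policy)

def check_upload(media: bytes, threshold: float = 0.5) -> str:
    """Flag the upload if enough of its fingerprints match a reference,
    then report the owner's chosen policy (block, track, or monetize)."""
    hashes = fingerprint(media)
    matched = [reference_db[h] for h in hashes if h in reference_db]
    if len(matched) / max(len(hashes), 1) >= threshold:
        owner, policy = matched[0]
        return f"match: {owner} applies policy '{policy}'"
    return "no match"
```

The threshold mirrors the idea that references must be "of sufficient length and quality to generate an effective ID File"; short or noisy matches fall below it.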

By 2010, YouTube had "already invested tens of millions of dollars in this technology".[107]

In 2011, YouTube described Content ID as "very accurate in finding uploads that look similar to reference files that are of sufficient length and quality to generate an effective ID File".[109]

By 2012, Content ID accounted for over a third of the monetized views on YouTube.[110]

An independent test in 2009 uploaded multiple versions of the same song to YouTube and concluded that while the system was "surprisingly resilient" in finding copyright violations in the audio tracks of videos, it was not infallible.[111] The use of Content ID to remove material automatically has led to controversy in some cases, as the videos have not been checked by a human for fair use.[112] If a YouTube user disagrees with a decision by Content ID, it is possible to fill in a form disputing the decision.[113]

Before 2016, videos were not monetized until the dispute was resolved. Since April 2016, videos continue to be monetized while the dispute is in progress, and the money goes to whoever won the dispute.[114] Should the uploader want to monetize the video again, they may remove the disputed audio in the "Video Manager".[115] YouTube has cited the effectiveness of Content ID as one of the reasons why the site's rules were modified in December 2010 to allow some users to upload videos of unlimited length.[116]
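The post-2016 arrangement, in which revenue keeps accruing while a dispute is open and is then paid to whichever party prevails, behaves like a simple escrow. A hypothetical sketch, not YouTube's implementation:

```python
class DisputeEscrow:
    """Hypothetical model of post-April-2016 dispute monetization:
    ad revenue accrues during the dispute and is held, then paid
    out in full to whichever party wins."""

    def __init__(self):
        self.held = 0.0
        self.resolved = False

    def accrue(self, ad_revenue: float) -> None:
        # Monetization continues while the dispute is in progress.
        if not self.resolved:
            self.held += ad_revenue

    def resolve(self, winner: str) -> tuple:
        # The accumulated revenue goes to the winning party.
        self.resolved = True
        payout, self.held = self.held, 0.0
        return winner, payout
```

Before 2016 the equivalent model would simply have frozen accrual until resolution, which is why the change mattered to uploaders with time-sensitive videos.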

Moderation and offensive content

YouTube has a set of community guidelines aimed at reducing abuse of the site's features. The uploading of videos containing defamation, pornography, or material encouraging criminal conduct is forbidden by YouTube's "Community Guidelines".[117] Generally prohibited material includes sexually explicit content, videos of animal abuse, shock videos, content uploaded without the copyright holder's consent, hate speech, spam, and predatory behavior.[117] YouTube relies on its users to flag videos as inappropriate, and a YouTube employee will view a flagged video to determine whether it violates the site's guidelines.[117] Despite the guidelines, YouTube has faced criticism over aspects of its operations,[118] including recommendation algorithms that perpetuate videos promoting conspiracy theories and falsehoods,[119] hosting videos ostensibly targeting children but containing violent or sexually suggestive content involving popular characters,[120] videos of minors attracting pedophilic activity in their comment sections,[121] and fluctuating policies on the types of content that are eligible to be monetized with advertising.[118]

YouTube contracts companies to hire content moderators, who view content flagged as potentially violating YouTube's content policies and determine whether it should be removed. In September 2020, a class-action suit was filed by a former content moderator who reported developing post-traumatic stress disorder (PTSD) after an 18-month period on the job. The former content moderator said that she was regularly made to exceed YouTube's stated limit of four hours per day of viewing graphic content. The lawsuit alleges that YouTube's contractors gave little to no training or support for moderators' mental health, made prospective employees sign NDAs before showing them any examples of content they would see while reviewing, and censored all mention of trauma from internal forums. It also asserts that requests for extremely graphic content to be blurred, reduced in size, or made monochrome, per recommendations from the National Center for Missing and Exploited Children, were rejected by YouTube as not a high priority for the company.[122][123][124]

To limit the spread of misinformation and fake news via YouTube, the company has rolled out a comprehensive policy on how it plans to deal with technically manipulated videos.[125]

YouTube has also been criticized for suppressing opinions dissenting from governments' positions, especially related to the COVID-19 pandemic.[126][127][128]

Controversial content has included material relating to Holocaust denial and the Hillsborough disaster, in which 96 football fans from Liverpool were crushed to death in 1989.[129][130] In July 2008, the Culture and Media Committee of the House of Commons of the United Kingdom stated that it was "unimpressed" with YouTube's system for policing its videos, and argued that "proactive review of content should be standard practice for sites hosting user-generated content". YouTube responded by stating:

We have strict rules on what's allowed, and a system that enables anyone who sees inappropriate content to report it to our 24/7 review team and have it dealt with promptly. We educate our community on the rules and include a direct link from every YouTube page to make this process as easy as possible for our users. Given the volume of content uploaded on our site, we think this is by far the most effective way to make sure that the tiny minority of videos that break the rules come down quickly.[131] (July 2008)

In October 2010, U.S. Congressman Anthony Weiner urged YouTube to remove from its website videos of imam Anwar al-Awlaki.[132] YouTube pulled some of the videos in November 2010, stating they violated the site's guidelines.[133] In December 2010, YouTube added the ability to flag videos for containing terrorism content.[134]

In 2018, YouTube introduced a system that automatically adds information boxes to videos that its algorithms determine may present conspiracy theories or other fake news, filling the infobox with content from Encyclopedia Britannica and Wikipedia as a means to inform users and minimize misinformation propagation without impacting freedom of speech.[135] In the wake of the Notre-Dame de Paris fire on April 15, 2019, several user-uploaded videos of the landmark burning were automatically flagged by YouTube's system with an Encyclopedia Britannica article on the false conspiracy theories around the September 11 attacks. Several users complained to YouTube about this inappropriate connection. YouTube officials apologized, stating that their algorithms had misidentified the fire videos and added the information block automatically, and that steps were being taken to remedy this.[136]

Five leading content creators whose channels were based on LGBTQ+ material filed a federal lawsuit against YouTube in August 2019, alleging that YouTube's algorithms divert discovery away from their channels, impacting their revenue. The plaintiffs claimed that the algorithms discourage content with words like "lesbian" or "gay", which are predominant in their channels' content, and that because of YouTube's near-monopolization of online video services, the company was abusing that position.[137]

Conspiracy theories and fringe discourse

YouTube has been criticized for using an algorithm that gives great prominence to videos that promote conspiracy theories, falsehoods and incendiary fringe discourse.[138][139][140] According to an investigation by The Wall Street Journal, "YouTube's recommendations often lead users to channels that feature conspiracy theories, partisan viewpoints and misleading videos, even when those users haven't shown interest in such content. When users show a political bias in what they choose to view, YouTube typically recommends videos that echo those biases, often with more-extreme viewpoints."[138][141] When users search for political or scientific terms, YouTube's search algorithms often give prominence to hoaxes and conspiracy theories.[140][142] After YouTube drew controversy for giving top billing to videos promoting falsehoods and conspiracy when people made breaking-news queries during the 2017 Las Vegas shooting, YouTube changed its algorithm to give greater prominence to mainstream media sources.[138][143][144][145] In 2018, it was reported that YouTube was again promoting fringe content about breaking news, giving great prominence to conspiracy videos about Anthony Bourdain's death.[146]

In 2017, it was revealed that advertisements were being placed on extremist videos, including videos by rape apologists, anti-Semites, and hate preachers who received ad payouts.[147] After firms started to stop advertising on YouTube in the wake of this reporting, YouTube apologized and said that it would give firms greater control over where ads got placed.[147]

Alex Jones, known for right-wing conspiracy theories, had built a massive audience on YouTube.[148] YouTube drew criticism in 2018 when it removed a video from Media Matters compiling offensive statements made by Jones, stating that it violated its policies on "harassment and bullying".[149] On August 6, 2018, however, YouTube removed Alex Jones' YouTube page following a content violation.[150]

University of North Carolina professor Zeynep Tufekci has referred to YouTube as "The Great Radicalizer", saying "YouTube may be one of the most powerful radicalizing instruments of the 21st century."[151] Jonathan Albright of the Tow Center for Digital Journalism at Columbia University described YouTube as a "conspiracy ecosystem".[140][152]

In January 2019, YouTube said that it had introduced a new policy, starting in the United States, intended to stop recommending videos containing "content that could misinform users in harmful ways." YouTube gave flat earth theories, miracle cures, and 9/11 trutherism as examples.[153] Efforts within YouTube engineering to stop recommending borderline extremist videos that fell just short of forbidden hate speech, and to track their popularity, had originally been rejected because they could interfere with viewer engagement.[154] In late 2019, the site began implementing measures directed towards "raising authoritative content and reducing borderline content and harmful misinformation."[155]

In a July 2019 study based on ten YouTube searches related to climate and climate change, conducted using the Tor Browser, the majority of returned videos communicated views contrary to the scientific consensus on climate change.[156]

A 2019 BBC investigation of YouTube searches in ten different languages found that YouTube's algorithm promoted health misinformation, including fake cancer cures.[157] In Brazil, YouTube has been linked to pushing pseudoscientific misinformation on health matters, as well as to elevating far-right fringe discourse and conspiracy theories.[158]

After misinformation spread via YouTube claiming that 5G communications technology was responsible for the spread of coronavirus disease 2019, leading to numerous 5G towers in the United Kingdom being attacked, YouTube removed all videos linking 5G and the coronavirus in this manner.[159]

Hateful content

Before 2019, YouTube had taken steps to remove specific videos or channels related to supremacist content that violated its acceptable use policies, but otherwise had no site-wide policy against hate speech.[160]

In the wake of the March 2019 Christchurch mosque attacks, YouTube and other sites that allow user-submitted content, such as Facebook and Twitter, drew criticism for doing little to moderate and control the spread of hate speech, which was considered a factor in the rationale for the attacks.[161][162] These platforms were pressured to remove such content, but in an interview with The New York Times, YouTube's chief product officer Neal Mohan said that unlike content such as ISIS videos, which follow a particular format and are thus easy to detect through computer-aided algorithms, general hate speech was more difficult to recognize and handle, and thus could not readily be removed without human interaction.[163]

YouTube joined an initiative led by France and New Zealand with other countries and tech companies in May 2019 to develop tools to be used to block online hate speech and to develop regulations, to be implemented at the national level, to be levied against technology firms that failed to take steps to remove such speech, though the United States declined to participate.[164][165] Subsequently, on June 5, 2019, YouTube announced a major change to its terms of service, "specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status." YouTube identified specific examples of such videos as those that "promote or glorify Nazi ideology, which is inherently discriminatory". YouTube further stated it would "remove content denying that well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary, took place."[160][166]

In June 2020, YouTube banned several channels associated with white supremacy, including those of Stefan Molyneux, David Duke, and Richard B. Spencer, asserting that these channels violated its policies on hate speech. The ban occurred the same day that Reddit announced bans on several hate speech sub-forums, including r/The_Donald.[167]

Child protection

Leading into 2017, there was a significant increase in the number of videos related to children, driven both by the popularity of parents vlogging their families' activities and by established creators moving away from content that was often criticized or demonetized toward family-friendly material. In 2017, YouTube reported that time spent watching family vloggers had increased by 90%.[168][169] However, with the increase in videos featuring children, the site began to face several controversies related to child safety. During Q2 2017, the owners of the popular channel FamilyOFive, which featured them playing "pranks" on their children, were accused of child abuse. Their videos were eventually deleted, and two of their children were removed from their custody.[170][171][172][173] A similar case occurred in 2019, when the owner of the channel Fantastic Adventures was accused of abusing her adopted children; her videos were later deleted.[174]

Later that year, YouTube came under criticism for showing inappropriate videos targeted at children and often featuring popular characters in violent, sexual or otherwise disturbing situations, many of which appeared on YouTube Kids and attracted millions of views. The term "Elsagate" was coined on the Internet and then used by various news outlets to refer to this controversy.[175][176][177][178] On November 11, 2017, YouTube announced it was strengthening site security to protect children from unsuitable content. Later that month, the company started to mass delete videos and channels that made improper use of family-friendly characters. As part of a broader concern regarding child safety on YouTube, the wave of deletions also targeted channels that showed children taking part in inappropriate or dangerous activities under the guidance of adults. Most notably, the company removed Toy Freaks, a channel with over 8.5 million subscribers that featured a father and his two daughters in odd and upsetting situations.[179][180][181][182][183] According to analytics specialist SocialBlade, it earned up to £8.7 million annually prior to its deletion.[184]

Even for content that appears to be aimed at children and to contain only child-friendly material, YouTube's system allows uploaders to remain anonymous. This has raised questions in the past, as YouTube has had to remove channels with children's content which, after becoming popular, suddenly included inappropriate content masked as children's content.[185] In addition, some of the most-watched children's programming on YouTube comes from channels with no identifiable owners, raising concerns about their intent and purpose.

One channel of concern was "Cocomelon", which provided numerous mass-produced animated videos aimed at children. Through 2019, it had drawn up to US$10 million a month in ad revenue and was one of the largest kid-friendly channels on YouTube before 2020. Ownership of Cocomelon was unclear beyond its ties to "Treasure Studio", itself an unknown entity, raising questions about the channel's purpose,[185][186][187] but in February 2020 Bloomberg News was able to confirm and interview the small team of American owners, who stated that their goal was simply to entertain children and that they kept to themselves to avoid attention from outside investors.[188] The anonymity of such channels raises concerns because of the lack of knowledge about what purpose they are trying to serve.[189] The difficulty of identifying who operates these channels "adds to the lack of accountability", according to Josh Golin of the Campaign for a Commercial-Free Childhood, and educational consultant Renée Chernow-O'Leary found the videos were designed to entertain with no intent to educate, leading both critics and parents to be concerned about children becoming too enraptured by the content from these channels.[185] Content creators who earnestly make kid-friendly videos have found it difficult to compete with larger channels such as ChuChu TV, as they cannot produce content at the same rate as these large channels and lack the same means of promotion through YouTube's recommendation algorithms that the larger animated channel networks have shared.[189]

In January 2019, YouTube officially banned videos containing "challenges that encourage acts that have an inherent risk of severe physical harm" (such as, for example, the Tide Pod Challenge), and videos featuring pranks that "make victims believe they're in physical danger" or cause emotional distress in children.[190]

Sexualization of children

Also in November 2017, it was revealed in the media that many videos featuring children—often uploaded by the minors themselves, and showing innocent content such as the children playing with toys or performing gymnastics—were attracting comments from pedophiles,[191][192] with predators finding the videos through private YouTube playlists or by typing certain keywords in Russian.[192] Other child-centric videos originally uploaded to YouTube began propagating on the dark web, and were uploaded to or embedded on forums known to be used by pedophiles.[193]

As a result of the controversy, which added to the concern about "Elsagate", several major advertisers whose ads had been running against such videos froze spending on YouTube.[178][194] In December 2018, The Times found more than 100 grooming cases in which children were manipulated into sexually suggestive behavior (such as taking off clothes, adopting sexualized poses and touching other children inappropriately) by strangers.[195] After a reporter flagged the videos in question, half of them were removed, and the rest were removed after The Times contacted YouTube's PR department.[195]

In February 2019, YouTube vlogger Matt Watson identified a "wormhole" that would cause the YouTube recommendation algorithm to draw users into this type of video content, and make all of that user's recommended content feature only these types of videos. Most of these videos had comments from sexual predators noting timestamps of when the children were shown in compromising positions, or otherwise making indecent remarks. In some cases, other users had reuploaded the videos in unlisted form but with incoming links from other videos, and then monetized them, propagating this network.[196] In the wake of the controversy, the service reported that it had deleted over 400 channels and tens of millions of comments, and had reported the offending users to law enforcement and the National Center for Missing and Exploited Children. A spokesperson explained that "any content—including comments—that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. There's more to be done, and we continue to work to improve and catch abuse more quickly."[197][198] Despite these measures, AT&T, Disney, Dr. Oetker, Epic Games, and Nestlé all pulled their advertising from YouTube.[196][199]

Subsequently, YouTube began to demonetize and block advertising on the types of videos that have drawn these predatory comments. The service explained that this was a temporary measure while they explore other methods to eliminate the problem.[200] YouTube also began to flag channels that predominantly feature children, and preemptively disable their comments sections. "Trusted partners" can request that comments be re-enabled, but the channel will then become responsible for moderating comments. These actions mainly target videos of toddlers, but videos of older children and teenagers may be protected as well if they contain actions that can be interpreted as sexual, such as gymnastics. YouTube stated it was also working on a better system to remove comments on other channels that matched the style of child predators.[201][202]

A related attempt to algorithmically flag videos containing references to the string "CP" (an abbreviation of child pornography) resulted in a number of prominent false positives involving unrelated topics that use the same abbreviation, including videos related to the mobile video game Pokémon Go (which uses "CP" as an abbreviation of the statistic "Combat Power") and Club Penguin. YouTube apologized for the errors and reinstated the affected videos.[203] Separately, online trolls have attempted to have videos flagged for takedown by posting comments similar to those the child predators had made; this activity became an issue during the PewDiePie vs T-Series rivalry in early 2019. YouTube stated that it does not take action on videos solely because of such comments, only on videos it has flagged as likely to draw child predator activity.[204]
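The false-positive problem described above can be illustrated with a minimal sketch. YouTube's actual classifier is not public, so the function and titles below are purely hypothetical: they show only why matching a bare abbreviation such as "CP" cannot distinguish illicit references from benign uses of the same letters.

```python
# Illustrative only: a naive term-matching filter, not YouTube's real system.
# It flags any title containing the abbreviation "CP" as a standalone word,
# so benign uses (Pokémon Go's "Combat Power", Club Penguin) are flagged too.
def naive_flag(title: str, banned_term: str = "CP") -> bool:
    """Return True if the banned abbreviation appears as a word in the title."""
    return banned_term in title.upper().split()

# Hypothetical video titles demonstrating the false positives:
titles = [
    "My Pokemon Go CP guide",       # false positive: CP = Combat Power
    "Club Penguin CP walkthrough",  # false positive: CP = Club Penguin
    "Cooking pasta at home",        # not flagged
]
flagged = [t for t in titles if naive_flag(t)]
```

Here both gaming titles are flagged while only the last escapes, mirroring the kind of collateral takedowns YouTube later apologized for.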

In June 2019, The New York Times cited researchers who found that users who watched erotic videos could be recommended seemingly innocuous videos of children.[205] As a result, Senator Josh Hawley announced plans to introduce federal legislation that would ban YouTube and other video-sharing sites from including videos that predominantly feature minors as "recommended" videos, excluding those that were "professionally produced", such as videos of televised talent shows.[206] YouTube has suggested potential plans to remove all videos featuring children from the main YouTube site and transfer them to the YouTube Kids site, where stronger controls over the recommendation system would apply, along with other major changes on the main YouTube site to the recommended feature and autoplay system.[207]

April Fools

YouTube featured an April Fools prank on the site on April 1 of every year from 2008 to 2016. In 2008, all links to videos on the main page were redirected to Rick Astley's music video "Never Gonna Give You Up", a prank known as "rickrolling".[208][209] The next year, when clicking on a video on the main page, the whole page turned upside down, which YouTube claimed was a "new layout".[210] In 2010, YouTube temporarily released a "TEXTp" mode which rendered video imagery into ASCII art letters "in order to reduce bandwidth costs by $1 per second."[211]

The next year, the site celebrated its "100th anniversary" with a range of sepia-toned silent, early 1900s-style films, including a parody of Keyboard Cat.[212] In 2012, clicking on the image of a DVD next to the site logo led to a video about a purported option to order every YouTube video for home delivery on DVD.[213]

In 2013, YouTube teamed up with satirical newspaper company The Onion to claim in an uploaded video that the video-sharing website was launched as a contest which had finally come to an end, and would shut down for ten years before being re-launched in 2023, featuring only the winning video. The video starred several YouTube celebrities, including Antoine Dodson. A video of two presenters announcing the nominated videos streamed live for 12 hours.[214][215]

In 2014, YouTube announced that it was responsible for the creation of all viral video trends, and revealed previews of upcoming trends, such as "Clocking", "Kissing Dad", and "Glub Glub Water Dance".[216] The next year, YouTube added a music button to the video bar that played samples from "Sandstorm" by Darude.[217] In 2016, YouTube introduced an option to watch every video on the platform in 360-degree mode with Snoop Dogg.[218]

References

  1. 1.0 1.1 Cite error: Invalid <ref> tag; no text was provided for refs named 2020revenue
  2. Más de 2000 millones de usuarios esta es la cantidad de usuarios de YouTube, que equivale casi a un tercio de los usuarios de Internet [More than 2 billion users: this is the number of YouTube users, equivalent to almost a third of Internet users] (in Spanish)
  3. Claburn, Thomas (January 5, 2017). "Google's Grumpy code makes Python Go". The Register. Retrieved September 16, 2017.
  4. Wilson, Jesse (May 19, 2009). "Guice Deuce". Official Google Code Blog. Retrieved March 25, 2017.
  5. "YouTube Architecture". High Scalability. Retrieved October 13, 2014.
  6. "Golang Vitess: a database wrapper written in Go as used by Youtube". October 23, 2018.
  7. 7.0 7.1 Goodrow, Cristos (February 27, 2017). "You know what's cool? A billion hours". YouTube. Retrieved April 19, 2021.
  8. 8.0 8.1 Loke Hale, James (May 7, 2019). "More Than 500 Hours Of Content Are Now Being Uploaded To YouTube Every Minute". TubeFilter. Los Angeles, CA. Retrieved June 10, 2019.
  9. 9.0 9.1 Helft, Miguel; Richtel, Matt (October 10, 2006). "Venture Firm Shares a YouTube Jackpot". The New York Times. Retrieved March 26, 2017.
  10. "YouTube founders now superstars". The Sydney Morning Herald. October 11, 2006. Retrieved March 18, 2021.
  11. 11.0 11.1 Cloud, John (December 25, 2006). "The YouTube Gurus". Time. Retrieved March 26, 2017.
  12. Hopkins, Jim (October 11, 2006). "Surprise! There's a third YouTube co-founder". USA Today. Retrieved March 26, 2017.
  13. Earliest surviving version of the YouTube website Wayback Machine, April 28, 2005. Retrieved June 19, 2013.
  14. "r | p 2006: YouTube: From Concept to Hypergrowth – Jawed Karim".
  15. Dredge, Stuart (March 16, 2016). "YouTube was meant to be a video-dating website". The Guardian. Retrieved March 15, 2019.
  16. Helft, Miguel (October 12, 2006). "San Francisco Hedge Fund Invested in YouTube". The New York Times. No. Vol.156, Issue 53, 730.
  17. Kehaulani Goo, Sara (October 7, 2006). "Ready for Its Close-Up". The Washington Post. Retrieved March 26, 2017.
  18. "Whois Record for www.youtube.com". DomainTools. Retrieved April 1, 2009.
  19. Alleyne, Richard (July 31, 2008). "YouTube: Overnight success has sparked a backlash". The Daily Telegraph. Retrieved March 26, 2017.
  20. "Me at the zoo". YouTube. April 23, 2005. Retrieved August 3, 2009.
  21. "Ronaldinho: Touch of Gold – YouTube". Wayback Machine. November 25, 2005. Archived from the original on November 25, 2005. Retrieved January 1, 2017.
  22. "Most Viewed – YouTube". Wayback Machine. November 2, 2005. Archived from the original on November 2, 2005. Retrieved January 1, 2017.
  23. "YouTube: a history". The Daily Telegraph. April 17, 2010. Retrieved March 26, 2017.
  24. Dickey, Megan Rose (February 15, 2013). "The 22 Key Turning Points in the History of YouTube". Business Insider. Retrieved March 25, 2017.
  25. Graham, Jefferson (November 21, 2005). "Video websites pop up, invite postings". USA Today. Retrieved March 26, 2017.
  26. 26.0 26.1 Pullen, John Patrick (February 23, 2011). "How Vimeo became hipster YouTube". Fortune. Retrieved May 8, 2020.
  27. 27.0 27.1 Novak, Matt (February 14, 2020). "Here's What People Thought of YouTube When It First Launched in the Mid-2000s". Gizmodo. Retrieved February 14, 2020.
  28. Biggs, John (February 20, 2006). "A Video Clip Goes Viral, and a TV Network Wants to Control It". The New York Times. Retrieved February 14, 2020.
  29. Wallenstein, Andrew; Spangler, Todd (December 18, 2015). "'Lazy Sunday' Turns 10: 'SNL' Stars Recall How TV Invaded the Internet". Variety. Retrieved April 27, 2019.
  30. Higgens, Bill (October 5, 2017). "Hollywood Flashback: 'SNL's' 'Lazy Sunday' Put YouTube on the Map in 2005". The Hollywood Reporter. Retrieved April 27, 2019.
  31. "YouTube serves up 100 million videos a day online". USA Today. July 16, 2006. Retrieved March 26, 2017.
  32. Zappone, Christian (October 12, 2006). "Help! YouTube is killing my business!". CNN. Retrieved November 29, 2008.
  33. Blakely, Rhys (November 2, 2006). "Utube sues YouTube". The Times. London. Retrieved November 29, 2008.
  34. La Monica, Paul R. (October 9, 2006). "Google to buy YouTube for $1.65 billion". CNNMoney. CNN. Retrieved March 26, 2017.
  35. Arrington, Michael (October 9, 2006). "Google Has Acquired YouTube". TechCrunch. AOL. Retrieved March 26, 2017.
  36. Arrington, Michael (November 13, 2006). "Google Closes YouTube Acquisition". TechCrunch. AOL. Retrieved March 26, 2017.
  37. "Google closes $A2b YouTube deal". The Age. November 14, 2006. Archived from the original on December 20, 2007. Retrieved March 26, 2017.
  38. Carter, Lewis (April 7, 2008). "Web could collapse as video demand soars". The Daily Telegraph. Retrieved March 26, 2017.
  39. "comScore Releases May 2010 U.S. Online Video Rankings". comScore. Retrieved June 27, 2010.
  40. "YouTube redesigns website to keep viewers captivated". AFP. Retrieved April 1, 2010.
  41. "YouTube moves past 3 billion views a day". CNET. CBS Interactive. May 25, 2011. Retrieved March 26, 2017.
  42. Bryant, Martin (May 25, 2011). "YouTube hits 3 Billion views per day, 2 DAYS worth of video uploaded every minute". The Next Web. Retrieved March 26, 2017.
  43. Oreskovic, Alexei (January 23, 2012). "Exclusive: YouTube hits 4 billion daily video views". Reuters. Thomson Reuters. Retrieved March 26, 2017.
  44. Whitelaw, Ben (April 20, 2011). "Almost all YouTube views come from just 30% of films". The Daily Telegraph. Retrieved March 26, 2017.
  45. "YouTube's website redesign puts the focus on channels". BBC. December 2, 2011. Retrieved December 2, 2011.
  46. Cashmore, Pete (October 26, 2006). "YouTube Gets New Logo, Facelift and Trackbacks – Growing Fast!". Retrieved December 2, 2011.
  47. "YouTube rolls out redesigned 'One Channel' layout to all users". The Next Web. June 5, 2013.
  48. Welch, Chris (May 19, 2013). "YouTube users now upload 100 hours of video every minute". The Verge. Vox Media. Retrieved March 26, 2017.
  49. E. Solsman, Joan (November 12, 2014). "YouTube's Music Key: Can paid streaming finally hook the masses?". CNET. CBS Interactive. Retrieved March 25, 2017.
  50. Wasserman, Todd (February 15, 2015). "The revolution wasn't televised: The early days of YouTube". Mashable. Retrieved July 4, 2018.
  51. "Hurley stepping down as YouTube chief executive". AFP. October 29, 2010. Retrieved October 30, 2010.
  52. Oreskovic, Alexei (February 5, 2014). "Google taps longtime executive Wojcicki to head YouTube". Reuters. Retrieved September 16, 2017.
  53. Avalos, George (January 20, 2016). "YouTube expansion in San Bruno signals big push by video site". San Jose Mercury News. Retrieved February 3, 2016.
  54. "YouTube has a new look and, for the first time, a new logo". The Verge. Retrieved May 7, 2018.
  55. "YouTube launches pay-to-watch subscription channels". BBC News. May 9, 2013. Retrieved May 11, 2013.
  56. Nakaso, Dan (May 7, 2013). "YouTube providers could begin charging fees this week". Mercury News. Retrieved May 10, 2013.
  57. "Paid content discontinued January 1, 2018 - YouTube Help". support.google.com. Retrieved April 19, 2021.
  58. Browne, Ryan (June 22, 2018). "YouTube introduces paid subscriptions and merchandise selling in bid to help creators monetize the platform". CNBC. Retrieved April 19, 2021.
  59. Parker, Laura (April 12, 2017). "A Chat With a Live Streamer Is Yours, for a Price". The New York Times. Retrieved April 21, 2018.
  60. "YouTube announces plans for a subscription music service". The Verge. Retrieved May 17, 2018.
  61. Reader, Ruth (October 21, 2015). "Google wants you to pay $9.99 per month for ad-free YouTube". Venturebeat. Retrieved October 22, 2015.
  62. "Exclusive: An inside look at the new ad-free YouTube Red". The Verge. Retrieved May 17, 2018.
  63. "YouTube Music isn't perfect, but it's still heaven for music nerds". Engadget. Retrieved November 7, 2016.
  64. Perez, Sarah (February 23, 2015). "Hands on With "YouTube Kids," Google's Newly Launched, Child-Friendly YouTube App". TechCrunch. AOL. Retrieved March 26, 2017.
  65. Dredge, Stuart (August 26, 2015). "Google launches YouTube Gaming to challenge Amazon-owned Twitch". The Guardian. Retrieved September 5, 2015.
  66. "YouTube shooting: Suspect visited shooting range before attack". BBC News. April 4, 2018. Retrieved April 9, 2018.
  67. Lumb, David (February 27, 2017). "One billion hours of YouTube are watched every day". Engadget. AOL. Retrieved March 26, 2017.
  68. Roose, Kevin (June 4, 2020). "Rabbit Hole, episode Eight: 'We Go All'". The New York Times. ISSN 0362-4331. Retrieved May 10, 2021.
  69. Gold, Hadas (March 19, 2020). "Netflix and YouTube are slowing down in Europe to keep the internet from breaking". CNN. Retrieved March 20, 2020.
  70. "YouTube is reducing the quality of videos for the next month — and it's because increased traffic amid the coronavirus outbreak is straining internet bandwidth". Business Insider. Retrieved March 24, 2020.
  71. Spangler, Todd (April 9, 2018). "YouTube Illegally Tracks Data on Kids, Groups Claim in FTC Complaint". Variety. Retrieved April 27, 2018.
  72. Mike, Masnick. "FTC's Latest Fine Of YouTube Over COPPA Violations Shows That COPPA And Section 230 Are On A Collision Course". Techdirt. Retrieved September 7, 2019.
  73. Kelly, Makena (September 4, 2019). "Google will pay $170 million for YouTube's child privacy violations". The Verge. Retrieved September 4, 2019.
  74. Fung, Brian. "Google and FTC reach $170 million settlement over alleged YouTube violations of kids' privacy". CNN Business. Retrieved September 4, 2019.
  75. Matthews, David (January 6, 2020). "YouTube rolls out new controls aimed at controlling children's content". TechSpot. Retrieved January 9, 2020.
  76. Kelly, Makena (December 11, 2019). "YouTube calls for 'more clarity' on the FTC's child privacy rules". The Verge. Retrieved December 11, 2019.
  77. Spangler, Todd (February 24, 2021). "YouTube New 'Supervised' Mode Will Let Parents Restrict Older Kids' Video Viewing". Variety. Retrieved April 19, 2021.
  78. Welch, Chris (April 18, 2019). "YouTube is finally coming back to Amazon's Fire TV devices". The Verge. Retrieved May 5, 2021.
  79. Solsman, Joan E. "Roku: YouTube TV app removed from channel store as deal with Google ends". CNET. Retrieved May 5, 2021.
  80. Seabrook, John (January 16, 2012). "Streaming Dreams". The New Yorker. Retrieved January 6, 2012.
  81. "Updates from VidCon: more users, more products, more shows and much more". Official YouTube Blog. Retrieved September 16, 2017.
  82. Hoffberger, Chase (December 21, 2012). "YouTube strips Universal and Sony of 2 billion fake views". The Daily Dot. Complex Media, Inc. Retrieved January 10, 2014.
  83. Sabbagh, Dan (December 28, 2012). "Two billion YouTube music video views disappear ... or just migrate?". The Guardian. Guardian News and Media Limited. Retrieved January 10, 2014.
  84. Haran, Brady (June 22, 2012). Why do YouTube views freeze at 301?. Numberphile. Retrieved August 30, 2018 – via YouTube.
  85. Snyder, Benjamin (August 6, 2015). "YouTube Finally Fixed This Annoying Feature". Time. Retrieved March 26, 2017.
  86. "Abbreviated public-facing subscriber counts". YouTube Engineering and Developers Blog. 2019.
  87. Spangler, Todd (March 30, 2021). "YouTube Launches Test to Hide Video 'Dislike' Counts". Variety. Retrieved March 30, 2021.
  88. "YouTube tests hiding dislike counts on videos". TechCrunch. Retrieved March 30, 2021.
  89. Marsden, Rhodri (August 12, 2009). "Why did my YouTube account get closed down?". The Independent. London. Retrieved August 12, 2009.
  90. Why do I have a sanction on my account? YouTube. Retrieved February 5, 2012.
  91. "Is YouTube's three-strike rule fair to users?". BBC News. London. May 21, 2010. Retrieved February 5, 2012.
  92. "Viacom will sue YouTube for $1bn". BBC News. March 13, 2007. Retrieved May 26, 2008.
  93. "Mediaset Files EUR500 Million Suit Vs Google's YouTube". CNNMoney.com. July 30, 2008. Retrieved August 19, 2009.
  94. "Premier League to take action against YouTube". The Daily Telegraph. May 5, 2007. Retrieved March 26, 2017.
  95. Egelko, Bob (August 20, 2008). "Woman can sue over YouTube clip de-posting". San Francisco Chronicle. Retrieved August 25, 2008.
  96. Finley, Klint (November 19, 2015). "Google Pledges to Help Fight Bogus YouTube Copyright Claims—for a Few". Wired. Retrieved March 25, 2017.
  97. Ohio Northern District Court (July 18, 2013). "Court Docket". Smith v. Summit Entertainment LLC. Docket Alarm, Inc. Retrieved October 21, 2014.
  98. District Judge James G. Carr (June 6, 2011). "Order". Smith v. Summit Entertainment LLC. United States District Court, N.D. Ohio, Western Division. Retrieved November 7, 2011.
  99. "YouTube loses court battle over music clips". BBC News. London. April 20, 2012. Retrieved April 20, 2012.
  100. "YouTube's seven-year stand-off ends". BBC News. London. November 1, 2016. Retrieved November 2, 2016.
  101. "YouTube's Deal With Universal Blocks DMCA Counter Notices". TorrentFreak. April 5, 2013. Retrieved April 5, 2013.
  102. "Videos removed or blocked due to YouTube's contractual obligations". Retrieved April 5, 2013.
  103. Aswad, Jem (December 19, 2017). "YouTube Strikes New Deals With Universal and Sony Music". Variety. Retrieved April 22, 2021.
  104. 104.0 104.1 Alexander, Julia (May 24, 2019). "YouTubers and record labels are fighting, and record labels keep winning". The Verge. Retrieved April 22, 2021.
  105. Delaney, Kevin J. (June 12, 2007). "YouTube to Test Software To Ease Licensing Fights". Wall Street Journal. Retrieved December 4, 2011.
  106. YouTube Advertisers (February 4, 2008), Video Identification, retrieved August 29, 2018
  107. 107.0 107.1 King, David (December 2, 2010). "Content ID turns three". Official YouTube Blog. Retrieved August 29, 2018.
  108. "YouTube Content ID". YouTube. September 28, 2010. Retrieved May 25, 2015.
  109. 109.0 109.1 More about Content ID YouTube. Retrieved December 4, 2011.
  110. Press Statistics YouTube. Retrieved March 13, 2012.
  111. Von Lohmann, Fred (April 23, 2009). "Testing YouTube's Audio Content ID System". Retrieved December 4, 2011.
  112. Von Lohmann, Fred (February 3, 2009). "YouTube's January Fair Use Massacre". Retrieved December 4, 2011.
  113. Content ID disputes YouTube. Retrieved December 4, 2011.
  114. Hernandez, Patricia. "YouTube's Content ID System Gets One Much-Needed Fix". Kotaku. Retrieved September 16, 2017.
  115. "Remove Content ID claimed songs from my videos – YouTube Help". support.google.com. Retrieved September 17, 2017.
  116. Siegel, Joshua; Mayle, Doug (December 9, 2010). "Up, Up and Away – Long videos for more users". Official YouTube Blog. Retrieved March 25, 2017.
  117. 117.0 117.1 117.2 "YouTube Community Guidelines". YouTube. Archived from the original on March 4, 2017. Retrieved November 30, 2008.[better source needed]
  118. 118.0 118.1 Alexander, Julia (May 10, 2018). "The Yellow $: a comprehensive history of demonetization and YouTube's war with creators". Polygon. Retrieved November 3, 2019.
  119. Wong, Julia Carrie; Levin, Sam (January 25, 2019). "YouTube vows to recommend fewer conspiracy theory videos". The Guardian. ISSN 0261-3077. Retrieved November 3, 2019.
  120. Orphanides, K. G. (March 23, 2018). "Children's YouTube is still churning out blood, suicide and cannibalism". Wired UK. ISSN 1357-0978. Retrieved November 3, 2019.
  121. Orphanides, K. G. (February 20, 2019). "On YouTube, a network of paedophiles is hiding in plain sight". Wired UK. ISSN 1357-0978. Retrieved November 3, 2019.
  122. Kimball, Whitney (September 22, 2020). "Content Moderator Exposed to Child Assault and Animal Torture Sues YouTube". Gizmodo. Retrieved October 11, 2020.
  123. Vincent, James (September 22, 2020). "Former YouTube content moderator sues the company after developing symptoms of PTSD". The Verge. Retrieved October 11, 2020.
  124. Elias, Jennifer (September 22, 2020). "Former YouTube content moderator describes horrors of the job in new lawsuit". CNBC. Retrieved October 11, 2020.
  125. Alba, Davey (February 3, 2020). "YouTube Says It Will Ban Misleading Election-Related Content". The New York Times. Retrieved February 10, 2020.
  126. Wright, Mike (May 21, 2020). "YouTube accused of 'censorship' for removing video claiming Covid-19 could 'burn out' before vaccine". The Telegraph. Retrieved May 28, 2020.
  127. Kaminska, Izabella (May 27, 2020). "Censortech strikes again". FT Alphaville. Retrieved May 28, 2020.
  128. "Letter to YouTube from BBW" (PDF). Big Brother Watch. Retrieved May 28, 2020.
  129. "YouTube criticized in Germany over anti-Semitic Nazi videos". Reuters. Retrieved May 28, 2008.
  130. "Fury as YouTube carries sick Hillsboro video insult". icLiverpool. Archived from the original on March 20, 2012. Retrieved November 29, 2015.
  131. Kirkup, James; Martin, Nicole (July 31, 2008). "YouTube attacked by MPs over sex and violence footage". The Daily Telegraph. Retrieved March 26, 2017.
  132. "Al-Awlaki's YouTube Videos Targeted by Rep. Weiner". Fox News. October 25, 2010. Retrieved November 13, 2010.
  133. F. Burns, John; Helft, Miguel (November 4, 2010). "YouTube Withdraws Cleric's Videos". The New York Times. Retrieved March 26, 2017.
  134. Bennett, Brian (December 12, 2010). "YouTube is letting users decide on terrorism-related videos". Los Angeles Times. Retrieved November 29, 2015.
  135. Newton, Casey (March 13, 2018). "YouTube will add information from Wikipedia to videos about conspiracies". The Verge. Retrieved April 15, 2019.
  136. Bergen, Mark (April 15, 2019). "YouTube Flags Notre-Dame Fire as 9/11 Conspiracy, Says System Made 'Wrong Call'". Bloomberg L.P. Retrieved April 15, 2019.
  137. Bensinger, Greg; Albergotti, Reed (August 14, 2019). "YouTube discriminates against LGBT content by unfairly culling it, suit alleges". The Washington Post. Retrieved August 14, 2019.
  138. 138.0 138.1 138.2 Nicas, Jack (February 7, 2018). "How YouTube Drives People to the Internet's Darkest Corners". Wall Street Journal. ISSN 0099-9660. Retrieved June 16, 2018.
  139. "As Germans Seek News, YouTube Delivers Far-Right Tirades". Retrieved September 8, 2018.
  140. 140.0 140.1 140.2 Ingram, Matthew (September 19, 2018). "YouTube's secret life as an engine for right-wing radicalization". Columbia Journalism Review. Retrieved March 26, 2019.
  141. Lewis, Rebecca (September 2018). "Alternative Influence: Broadcasting the Reactionary Right on YouTube" (PDF). datasociety.net. Data and Society. Retrieved March 26, 2019.
  142. "YouTube wants the news audience, but not the responsibility". Columbia Journalism Review. Retrieved September 23, 2018.
  143. Nicas, Jack (October 6, 2017). "YouTube Tweaks Search Results as Las Vegas Conspiracy Theories Rise to Top". Wall Street Journal. ISSN 0099-9660. Retrieved June 16, 2018.
  144. "Here's How YouTube Is Spreading Conspiracy Theories About The Vegas Shooting". BuzzFeed. Retrieved June 16, 2018.
  145. "The Big Tech Platforms Still Suck During Breaking News". BuzzFeed. Retrieved June 16, 2018.
  146. "YouTube Is Spreading Conspiracy Theories about Anthony Bourdain's Death". BuzzFeed. Retrieved June 16, 2018.
  147. 147.0 147.1 "Google apologises as M&S pulls ads". BBC News. March 20, 2017. Retrieved June 16, 2018.
  148. Lewis, Paul (February 2, 2018). "'Fiction is outperforming reality': how YouTube's algorithm distorts truth". The Guardian. ISSN 0261-3077. Retrieved June 16, 2018.
  149. Levin, Sam (April 23, 2018). "YouTube under fire for censoring video exposing conspiracy theorist Alex Jones". The Guardian. Retrieved June 16, 2018.
  150. Salinas, Sara (August 6, 2018). "YouTube removes Alex Jones' page, following bans from Apple and Facebook." CNBC. Retrieved October 15, 2018.
  151. "Opinion | YouTube, the Great Radicalizer". Retrieved June 16, 2018.
  152. "Parkland shooting 'crisis actor' videos lead users to a 'conspiracy ecosystem' on YouTube, new research shows". Washington Post. Retrieved September 23, 2018.
  153. Weill, Kelly (January 25, 2019). "YouTube Tweaks Algorithm to Fight 9/11 Truthers, Flat Earthers, Miracle Cures". Retrieved January 29, 2019.
  154. Bergen, Mark (April 2, 2019). "YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant". Bloomberg News. Retrieved April 2, 2019.
  155. "The Four Rs of Responsibility, Part 2: Raising authoritative content and reducing borderline content and harmful misinformation". Official YouTube Blog. YouTube. Retrieved January 31, 2020.
  156. Allgaier, Joachim (July 25, 2019). "Science and Environmental Communication on YouTube: Strategically Distorted Communications in Online Videos on Climate Change and Climate Engineering". Frontiers in Communication. 4. doi:10.3389/fcomm.2019.00036. ISSN 2297-900X.
  157. Carmichael, Flora; News, Juliana Gragnani Beyond Fake; Monitoring, B.B.C. "How YouTube makes money from fake cancer cure videos". BBC News. Retrieved September 27, 2019. {{cite web}}: |last2= has generic name (help)
  158. Fisher, Max; Taub, Amanda (August 11, 2019). "How YouTube Radicalized Brazil". The New York Times. ISSN 0362-4331. Retrieved August 12, 2019.
  159. Hern, Alex (April 5, 2020). "YouTube moves to limit spread of false coronavirus 5G theory". The Guardian. Retrieved April 5, 2020.
  160. 160.0 160.1 "Our ongoing work to tackle hate". YouTube. June 5, 2019. Retrieved April 9, 2020.
  161. Robertson, Adi (March 15, 2019). "Questions about policing online hate are much bigger than Facebook and YouTube". The Verge. Retrieved April 9, 2020.
  162. Timberg, Craig; Harwell, Drew; Shaban, Hamza; Ba Tran, Andrew; Fung, Brian (March 15, 2020). "The New Zealand shooting shows how YouTube and Facebook spread hate and violent images – yet again". The Washington Post. Retrieved April 9, 2020.
  163. Roose, Kevin (March 29, 2019). "YouTube's Product Chief on Online Radicalization and Algorithmic Rabbit Holes". The New York Times. Retrieved April 9, 2020.
  164. Browne, Ryan (May 15, 2019). "New Zealand and France unveil plans to tackle online extremism without the US on board". CNBC. Retrieved April 9, 2020.
  165. Willsher, Kim (May 15, 2019). "Leaders and tech firms pledge to tackle extremist violence online". The Guardian. Retrieved April 9, 2020.
  166. Newton, Casey (June 5, 2019). "YouTube just banned supremacist content, and thousands of channels are about to be removed". The Verge. Retrieved April 9, 2020.
  167. Alexander, Julia (June 29, 2020). "YouTube bans Stefan Molyneux, David Duke, Richard Spencer, and more for hate speech". The Verge. Retrieved June 29, 2020.
  168. Luscombe, Belinda (May 18, 2017). "The YouTube Parents Who are Turning Family Moments into Big Bucks". Time. Retrieved June 21, 2019.
  169. Alexander, Julia (June 21, 2019). "YouTube can't remove kid videos without tearing a hole in the entire creator ecosystem". The Verge. Retrieved June 21, 2019.
  170. Ohlheiser, Abby (April 26, 2017). "The saga of a YouTube family who pulled disturbing pranks on their own kids". The Washington Post.
  171. Cresci, Elena (May 7, 2017). "Mean stream: how YouTube prank channel DaddyOFive enraged the internet". The Guardian. ISSN 0261-3077. Retrieved June 7, 2017.
  172. Dunphy, Rachel (April 28, 2017). "The Abusive 'Pranks' of YouTube Family Vloggers". Select All. New York Magazine. Retrieved July 9, 2017.
  173. Gajanan, Mahita (May 3, 2017). "YouTube Star DaddyOFive Loses Custody of 2 Children Shown in 'Prank' Videos". Time. Retrieved July 9, 2017.
  174. Eric Levenson and Mel Alonso. "A mom on a popular YouTube show is accused of pepper-spraying her kids when they flubbed their lines". CNN.
  175. Ben Popper, Adults dressed as superheroes is YouTube's new, strange, and massively popular genre, The Verge, February 4, 2017
  176. "Report: Thousands of videos mimicking popular cartoons on YouTube Kids contain inappropriate content". NEWS10 ABC. March 31, 2017. Retrieved April 30, 2017.
  177. Sapna Maheshwari, On YouTube Kids, Startling Videos Slip Past Filters, The New York Times, November 4, 2017
  178. 178.0 178.1 Dani Di Placido, YouTube's "Elsagate" Illuminates The Unintended Horrors Of The Digital Age, Forbes, November 28, 2017
  179. Todd Spangler, YouTube Terminates Toy Freaks Channel Amid Broader Crackdown on Disturbing Kids’ Content, Variety, November 17, 2017
  180. Popper, Ben (November 9, 2017). "YouTube says it will crack down on bizarre videos targeting children". The Verge. Archived from the original on November 16, 2017. In August of this year, YouTube announced that it would no longer allow creators to monetize videos which "made inappropriate use of family friendly characters." Today it's taking another step to try to police this genre.
  181. Sarah Templeton, Disturbing 'ElsaGate', 'Toy Freaks' videos removed from YouTube after abuse allegations, Newshub, November 22, 2017
  182. YouTube to crack down on videos showing child endangerment, ABC News, November 22, 2017
  183. Charlie Warzel, YouTube Is Addressing Its Massive Child Exploitation Problem BuzzFeed, November 22, 2017
  184. Bridge, Mark; Mostrous, Alexi (November 18, 2017). "Child abuse on YouTube". The Times. Retrieved November 28, 2017.
  185. 185.0 185.1 185.2 Koh, Yoree; Morris, Betsy (April 11, 2019). "Kids Love These YouTube Channels. Who Creates Them Is a Mystery". Archived from the original on August 14, 2019. Retrieved August 14, 2019.
  186. Martineau, Paris. "YouTube Has Kid Troubles Because Kids Are a Core Audience". Wired. Archived from the original on August 11, 2019. Retrieved August 14, 2019.
  187. Graham, Jefferson (June 22, 2019). "Why YouTube's kid issues are so serious". USA Today. Archived from the original on August 14, 2019. Retrieved August 14, 2019.
  188. Bergan, Mark; Shaw, Lucas (February 10, 2020). "YouTube's Secretive Top Kids Channel Expands Into Merchandise". Bloomberg News. Retrieved June 15, 2020.
  189. 189.0 189.1 Haskins, Caroline (March 19, 2019). "YouTubers Are Fighting Algorithms to Make Good Content for Kids". Vice. Archived from the original on August 14, 2019. Retrieved August 14, 2019.
  190. Palladino, Valentina (January 16, 2019). "YouTube updates policies to explicitly ban dangerous pranks, challenges". Ars Technica. Retrieved January 16, 2019.
  191. YouTube videos of children are plagued by sexual comments, The Verge, November 15, 2017
  192. 192.0 192.1 Mostrous, Alexi; Bridge, Mark; Gibbons, Katie (November 24, 2017). "YouTube adverts fund paedophile habits". The Times. Retrieved November 28, 2017.
  193. Tait, Amelia (April 24, 2016). "Why YouTube mums are taking their kids offline". New Statesman. Retrieved June 21, 2019.
  194. Todd Spangler, YouTube Faces Advertiser Boycott Over Videos With Kids That Attracted Sexual Predators, Variety, November 25, 2017
  195. 195.0 195.1 Bridge, Harry Shukman, Mark (December 10, 2018). "Paedophiles grooming children live on YouTube". The Times. ISSN 0140-0460. Retrieved February 18, 2019.{{cite news}}: CS1 maint: multiple names: authors list (link)
  196. 196.0 196.1 Bergen, Mark; de Vynck, Gerrit; Palmeri, Christopher (February 20, 2019). "Nestle, Disney Pull YouTube Ads, Joining Furor Over Child Videos". Bloomberg News. Retrieved February 20, 2019.
  197. Alexander, Julia (February 21, 2019). "YouTube terminates more than 400 channels following child exploitation controversy". The Verge. Retrieved February 21, 2019.
  198. Brodkin, Jon (February 21, 2019). "YouTube loses advertisers over "wormhole into pedophilia ring"". Ars Technica. Retrieved February 22, 2019.
  199. Haselton, Todd; Salinas, Sara (February 21, 2019). "As fallout over pedophilia content on YouTube continues, AT&T pulls all advertisements". CNBC. Retrieved February 21, 2019.
  200. Ingraham, Nathan (February 22, 2019). "YouTube is proactively blocking ads on videos prone to predatory comments". Engadget. Retrieved February 22, 2019.
  201. Fox, Chris (February 28, 2019). "YouTube bans comments on all videos of kids". Retrieved March 2, 2019.
  202. Alexander, Julia (February 28, 2019). "YouTube is disabling comments on almost all videos featuring children". The Verge. Retrieved February 28, 2019.
  203. Gerken, Tom (February 19, 2019). "YouTube backtracks after Pokemon 'child abuse' ban". BBC News. Retrieved February 20, 2019.
  204. Alexander, Julia (February 28, 2019). "Trolls are lying about child porn to try to get YouTube channels taken down". The Verge. Retrieved February 28, 2019.
  205. Fisher, Max; Taub, Amanda (June 3, 2019). "On YouTube's Digital Playground, an Open Gate for Pedophiles". The New York Times. Retrieved June 6, 2019.
  206. Ingraham, Nathan (June 6, 2019). "A Senator wants to stop YouTube from recommending videos featuring minors". Engadget. Retrieved June 6, 2019.
  207. Copeland, Rob (June 19, 2019). "YouTube, Under Fire, Considers Major Changes to Kids' Content". Wall Street Journal. Retrieved June 19, 2019.
  208. Arrington, Michael (March 31, 2008). "YouTube RickRolls Users". TechCrunch. AOL. Retrieved March 26, 2017.
  209. Wortham, Jenna (April 1, 2008). "YouTube 'Rickrolls' Everyone". Wired. Retrieved March 26, 2017.
  210. Bas van den Beld (April 1, 2009). "April fools: YouTube turns the world up-side-down". searchcowboys.com. Archived from the original on April 3, 2009. Retrieved April 2, 2010.
  211. Pichette, Patrick (March 31, 2010). "TEXTp saves YouTube bandwidth, money". Official YouTube Blog. Retrieved March 25, 2017.
  212. Richmond, Shane (April 1, 2011). "YouTube goes back to 1911 for April Fools' Day". The Daily Telegraph. Retrieved March 26, 2017.
  213. Carbone, Nick (April 1, 2012). "April Fools' Day 2012: The Best Pranks from Around the Web". Time. Retrieved March 26, 2017.
  214. Quan, Kristene (April 1, 2013). "WATCH: YouTube Announces It Will Shut Down". Time. Retrieved March 26, 2017.
  215. Murphy, Samantha (March 31, 2013). "YouTube Says It's Shutting Down in April Fools' Day Prank". Mashable. Retrieved November 8, 2019.
  216. Kleinman, Alexis (April 1, 2014). "YouTube Reveals Its Viral Secrets in April Fools' Day Video". Huffington Post. Retrieved April 1, 2014.
  217. Alba, Alejandro (April 1, 2015). "17 April Fools' pranks from tech brands, tech giants today". NY Daily News. Retrieved June 12, 2016.
  218. Sini, Rozina (April 1, 2016). "Snoopavision and other April Fools jokes going viral". BBC News. Retrieved April 1, 2016.

Further reading

External links