(Reuters) – Alphabet Inc’s YouTube on Monday reinforced its guidelines on tackling fake or misleading election-related content on its platform as the United States gears up for the presidential election later this year.
YouTube will remove any content that has been "technically doctored" or manipulated, misleads users about the voting process, or makes false claims about a candidate, it said in a blog post.
Google and YouTube have been making changes to their platforms and moderating content as technology and social media companies come under fire for their role in spreading fake news, especially during elections.
While Google has said outright that it would remove election-related misleading content, Facebook Inc has announced limited changes to political ads on its platform.
Twitter Inc banned political ads in November, including those that reference a political candidate, party, election or legislation, in a push to ensure transparency.
Google and YouTube also prohibit certain kinds of misrepresentation in ads, such as misinformation about public voting procedures, false claims about a political candidate's eligibility based on age or birthplace, or incorrect claims that a public figure has died.
(Reporting by Neha Malara in Bengaluru; Editing by Anil D’Silva)