TLDR:
Perplexity, an AI search tool, is facing criticism for relying on AI-generated blog and LinkedIn posts that contain inaccurate or out-of-date information, allowing misinformation to spread through the platform. Although the company claims to cite reliable sources, a study found that it frequently references AI-generated content, raising concerns about the quality and accuracy of the information Perplexity provides.
Key Elements:
- Perplexity is criticized for citing AI-generated blog posts with inaccurate information.
- The company has come under fire for allegedly plagiarizing journalistic work from various news outlets.
- The search engine’s reliance on AI-generated sources raises concerns about the spread of misinformation and the potential for bias in the underlying data.
- Perplexity claims to be improving its search engine by refining the processes that identify relevant, high-quality sources, but the study shows that AI-generated content remains prevalent.
- Experts warn that using low-quality web sources can lead to the promotion of disinformation and biases in AI models.
Overall, Perplexity’s reliance on AI-generated content and its handling of authoritative sources raise questions about the quality and accuracy of the information provided by the platform.