Tech Beat by Namecheap – 22 September 2023
AI-generated books can be cause for amusement when they’re mostly made up of gibberish, but unfortunately, AI ebook content isn’t always so benign. Recently, there has been an increase in AI books purported to be written by experts, giving out harmful advice and spreading dangerous ideas. Find out more in this week’s story, The AI-generated books that could harm you.
In tech news:
- Developers show concern about WordPress.com plugin listings outranking WordPress.org on Google. WordPress core developer John Blackbourn sparked a discussion about WordPress.com plugin listings outranking WordPress.org in Google search results. He expressed concern about the impact on the open source project (to which many people contribute on a volunteer basis) and accused WordPress.com of prioritizing its own commercial service over the long-term health of the FOSS (free and open source) project. WP Tavern reports that others chimed in with similar experiences, though some still see WordPress.org pages ranked higher. The issue stems from long-standing confusion between WordPress.com and WordPress.org, and many believe that renaming WordPress.com would help alleviate it. Automattic CEO Matt Mullenweg defended the duplication, stating that it provides greater distribution for plugin authors, but WP Tavern reports that the community has since been embroiled in heated discussions over the matter. Many developers worry about the SEO impact of cloning WordPress.org’s plugin directory for use on WordPress.com without backlinks to the original plugin listings, as well as uncertainty over whether the plugins are free, confusion they argue this practice only perpetuates.
- Microsoft’s AI expansion involves major water consumption. Artificial intelligence is growing rapidly, but it is creating a significant water usage problem. According to Gizmodo, Microsoft’s latest sustainability report shows a 30% increase in water usage between 2021 and 2022, primarily due to the company’s investment in AI and the need to cool data centers. To keep AI supercomputers running efficiently, data centers require large amounts of water, as overheating can lead to shutdowns. Microsoft’s data center in West Des Moines, Iowa, for example, draws water from nearby rivers to cool its supercomputer. This has raised concerns because those water sources also provide drinking water for nearby communities. The local utility company has stated that future data center projects will only be considered if they can significantly reduce their water usage.
- MSN News calls dead NBA player ‘useless’ in AI-generated obituary. In another issue facing Microsoft, the company faced criticism after publishing an AI-generated obituary for NBA star Brandon Hunter, who passed away suddenly at the age of 42. The obituary described him as “useless,” which outraged fans. Search Engine Journal observes that this incident highlights the dangers of relying solely on AI for content generation, as it can lead to factual inaccuracies and problematic errors. It is crucial to ensure that AI-produced work is supervised by humans to avoid reputational damage and negative impacts on search rankings. Microsoft swiftly removed the offensive article from its website but has yet to issue an official apology.
- A boy saw 15+ doctors over 3 years for chronic pain until ChatGPT found the diagnosis. A mother named Courtney had been searching for answers for her son Alex’s chronic pain and other symptoms for three years. TODAY reports that after seeing 17 doctors with no diagnosis, Courtney turned to artificial intelligence for help. She meticulously researched and provided detailed MRI notes to ChatGPT, which led to the suggestion of tethered cord syndrome as a possible diagnosis. Amazingly, the AI chatbot’s answer turned out to be correct. Courtney’s experience highlights the frustration of seeing multiple specialists who only address their own areas of expertise without considering the bigger picture, as well as the potential for AI tools to improve — and even save — lives.
- Google adds AI content to its SEO rankings. The leading search engine used to prefer content written by humans for humans, but that’s all about to change, according to Decrypt. A new update to Google’s search rankings will recognize the important role AI is playing in content creation, making the question of whether the writer was human or not less important. It’s reasonable to assess content on its benefit to the reader, but this change may make it harder to know whether what we’re reading was artificially generated. Google is also investing in AI through its chatbot Bard, its AI-generated news service, and new experimental search features.
- Two companies working to produce recyclable lithium-ion batteries. Batteries used in modern electronic devices and cars come from non-renewable resources and are destructive to the environment. So it’s great to hear in this report from The Verge that BASF and Nanotech Energy have teamed up to produce recyclable lithium-ion batteries. BASF is a battery materials producer, while Nanotech Energy creates graphene-based energy products. American Battery Technology Company (ABTC) will make use of materials gathered by Nanotech, and TODA Advanced Materials Inc. will prepare the materials that BASF will develop into batteries. BASF says this could decrease its carbon footprint by around 25%.
Previously in Tech Beat: Spotting AI-Generated Text
AI-powered text generators — such as ChatGPT, Bing, and Bard — have transformed the way we produce written content. While these tools are invaluable for brainstorming and refining the writing process, they also raise concerns about authenticity and originality. As AI-written content becomes more prevalent, the ability to distinguish between human and machine-authored content becomes increasingly important. Our article, “Spot the bot: uncovering AI-generated text,” investigates the issue of content authorship, especially in areas like news reporting and academic writing, as well as copyright issues. The evolving nature of AI text generators and the inherent biases in detection systems create additional challenges that companies are still trying to resolve.
Tip of the week: Don’t believe every e-book you read
Detecting whether an e-book (or any book) contains bad advice or misinformation can be challenging. In addition to verifying the author’s credentials and considering book reviews, here are several strategies you can use to assess the quality and reliability of any published work.
- Preview sample content. Many e-books allow you to preview a sample of their content. Take advantage of this to evaluate the writing style, depth of knowledge, and overall quality.
- Look for citations and sources. Look for proper citations and references within the e-book. Reliable advice should be backed by credible sources and evidence.
- Check for consistency. Ensure that the advice provided is consistent throughout the e-book. Inconsistencies or contradictions could be a sign of unreliable content.
- Evaluate grammar and writing quality. Poor grammar, spelling errors, and subpar writing quality can indicate a lack of professionalism and credibility.
Remember that while these strategies can help you assess the quality of an e-book, there’s no foolproof method to guarantee accuracy. Critical thinking and due diligence are essential when evaluating any advice or information, especially in the digital age where misinformation is prevalent.