From Google’s perspective, Bard looks like a rush to compete with ChatGPT, and some Googlers agree. A new Bloomberg report, based on interviews with 18 current and former employees, is full of damning comments and concerns from AI ethics teams that were “disempowered and demoralized” so that Google could get Bard out the door.
According to the report, Google employees were asked to test a pre-release version of Bard and give feedback, which was largely ignored so that Bard could launch sooner. Internal discussions viewed by Bloomberg called Bard “cringe-worthy” and “a pathological liar.” When asked how to land a plane, it gave incorrect instructions that would have led to a crash. One employee asked for scuba diving instructions and got a response they said “would likely result in serious injury or death.” Another described Bard’s problems in a February post titled “Bard is worse than useless: please do not launch.” Bard launched in March.
You could probably say the same about the AI competitor Google is chasing, OpenAI’s ChatGPT. Both can give biased or false information and hallucinate wrong answers. Google is playing catch-up, and the company is reportedly in a panic over ChatGPT’s ability to answer questions that people might otherwise type into Google Search. ChatGPT’s creator, OpenAI, has itself been criticized for a lax approach to AI safety and ethics. That leaves Google in a difficult spot: if the company’s chief concern is calming the stock market and catching up with ChatGPT, it probably can’t do that while also slowing down over ethics concerns.
Meredith Whittaker, a former Google manager and president of the Signal Foundation, told Bloomberg that “AI ethics has taken a back seat” at Google, and that if ethics aren’t prioritized over profit and growth, they ultimately won’t work. In recent years, several of Google’s AI ethics leaders have been fired or have left the company. Bloomberg reports that Google’s AI ethics reviews today are “almost entirely voluntary.”
While someone at Google could try to slow a release over ethical issues, it probably wouldn’t be good for their career. The report says: “One former employee said they asked to work on fairness in machine learning and were routinely discouraged — to the point that it affected their performance review. Managers protested that it was getting in the way of their ‘real work.’”