Apple has long been protective of its confidential data, a stance that has driven several actions by the company, including its well-known standoff with the FBI.
ChatGPT has become extremely popular among employees at many organizations, as it can handle everything from writing a simple email to developing sophisticated software.
Recent reports suggest that Apple has restricted some of its employees from using ChatGPT and other external artificial intelligence tools, reportedly out of concern that confidential company data could leak to the developers of those AI bots.
In addition, Apple has reportedly told its employees not to use GitHub Copilot, a Microsoft-owned tool that helps developers write software code.
Apple’s own Artificial Intelligence bot
Apple has been working on its own large language model and artificial intelligence bot, which would be able to answer questions and perform tasks much like a human.
The company has already acquired several AI startups since John Giannandrea, a former Google executive, became Apple's Senior Vice President of Machine Learning and AI Strategy.
Notably, OpenAI temporarily took ChatGPT offline in March 2023 after some users were able to see the titles of other users' chat histories.
Following that incident, an OpenAI spokeswoman said that users have the option to turn their chat history off, and that conversations with history enabled are used to train the company's AI models.
Companies such as JPMorgan Chase and Verizon, as well as countries such as Italy, have already banned or restricted the use of ChatGPT over data security concerns, and Apple has now joined them in restricting its use.
Apple launched Siri in 2011, and Amazon followed with a rival assistant, Alexa, in 2014. Since then, Alexa has arguably won that race, reaching a large number of users.
Source: https://cybersecuritynews.com/apple-blocks-chatgpt/