Chui said companies can license use of an existing AI platform, allowing them to monitor what employees say to a chatbot and ensure that any information shared is protected.
"If you're a corporation, you don't want your employees prompting a publicly available chatbot with confidential information," Chui said. "So, you could put technical means in place, where you can license the software and have an enforceable legal agreement about where your data goes or doesn't go." Licensing use of software comes with additional checks and balances, Chui said. Protection of confidential information, regulation of where the information gets stored, and guidelines for how employees can use the software — all are standard procedure when companies license software, AI or not.
"If you have an agreement, you can audit the software, so you can see if they're protecting the data in the ways that you want it to be protected," Chui said. Most companies that store information with cloud-based software already do this, Chui said, so getting ahead and offering employees a company-sanctioned AI platform means a business is already in line with existing industry practices.

One security option for companies is to develop their own GPT, or to hire companies that create this technology to build a custom version, said Sameer Penakalapati, chief executive officer at Ceipal, an AI-driven talent acquisition platform.