One of the biggest concerns legal professionals have about the use of artificial intelligence is data privacy. Many people do not realize, for instance, that the information they enter into ChatGPT may be recorded and used to further train that AI's model.
Cybersecurity and data privacy risks are especially concerning in highly regulated fields like healthcare and law, where data security is not just an ethical imperative but a fundamental requirement. These fields cannot use AI the way the general public does; strict safeguards must be in place.
2nd Chair is committed to a robust and open policy of data protection, in line with all applicable privacy laws. Here are some of the ways your data will be protected.
2nd Chair stores data using Amazon Web Services (AWS). AWS gives end users extensive control over their data, combined with strong data security, which is why companies bound by HIPAA and the GDPR trust it to store their confidential information securely.
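To give a concrete sense of what encryption at rest on AWS can look like, here is a minimal sketch in Python using the boto3 SDK. It is illustrative only: the bucket name is hypothetical, and this is not a description of 2nd Chair's actual configuration.

```python
import boto3

# Illustrative only: one common way to enforce encryption at rest on an S3
# bucket. The bucket name is hypothetical, not 2nd Chair's.
s3 = boto3.client("s3")

s3.put_bucket_encryption(
    Bucket="example-client-documents",  # hypothetical bucket name
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",  # encrypt new objects with a KMS-managed key
                },
                "BucketKeyEnabled": True,  # reduce KMS request overhead
            }
        ]
    },
)
```

With a rule like this in place, objects written to the bucket are encrypted by default, without relying on each upload to request encryption explicitly.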
True data security requires multiple layers of protection. In addition to the security provided by AWS, 2nd Chair stores each user's information in its own data locker. This additional layer ensures that no one else can access your stored data: David will never use the contents of your locker to answer questions asked against another locker, whether by you or any other user.
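As a rough illustration of what per-user locker isolation can mean in practice, here is a minimal Python sketch in which every read and write is scoped to the requesting user's own storage prefix. All names are hypothetical assumptions for the example, not 2nd Chair's actual implementation.

```python
import boto3

s3 = boto3.client("s3")
BUCKET = "example-client-lockers"  # hypothetical bucket


def locker_prefix(user_id: str) -> str:
    """Every read and write is confined to the caller's own locker prefix."""
    return f"lockers/{user_id}/"


def store_document(user_id: str, doc_name: str, contents: bytes) -> None:
    # Documents are written only under the owner's own prefix, encrypted at rest.
    s3.put_object(
        Bucket=BUCKET,
        Key=locker_prefix(user_id) + doc_name,
        Body=contents,
        ServerSideEncryption="aws:kms",
    )


def fetch_document(requesting_user_id: str, owner_user_id: str, doc_name: str) -> bytes:
    # A request can only resolve against the requester's own locker; asking
    # about another user's locker is refused outright.
    if requesting_user_id != owner_user_id:
        raise PermissionError("Documents can only be read from your own locker.")
    obj = s3.get_object(Bucket=BUCKET, Key=locker_prefix(owner_user_id) + doc_name)
    return obj["Body"].read()
```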
Data is only as secure as its endpoints. Just as your company has policies and protections in place to ensure data privacy and security, 2nd Chair limits who has access to your data: it can be retrieved only by the company's CTO, and only for support purposes. You can also request that 2nd Chair's CEO access your files to provide you with training and support.
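One way this kind of restriction is commonly expressed on AWS is a bucket policy that denies retrieval to everyone except a single designated principal. The sketch below is a hypothetical example only; the role name, account ID, and bucket are assumptions for illustration, not 2nd Chair's configuration.

```python
import json
import boto3

# Illustrative bucket policy: deny GetObject to everyone except one
# designated support role. All identifiers here are hypothetical.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyAllButDesignatedSupport",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::example-client-lockers/*",
            "Condition": {
                "ArnNotEquals": {
                    "aws:PrincipalArn": "arn:aws:iam::123456789012:role/cto-support-access"
                }
            },
        }
    ],
}

s3 = boto3.client("s3")
s3.put_bucket_policy(
    Bucket="example-client-lockers",
    Policy=json.dumps(bucket_policy),
)
```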
One of the major concerns with generative AI models is that they can be trained and refined on data from user inputs. 2nd Chair does not train or refine any models on client data. This includes the questions (called prompts) you ask our David tool. Neither your files nor your prompts to David are recorded in ways that would expose your confidential information, and your data will never be used to refine the tool for other users. There is no risk of your data being compromised by inclusion in David or in other generative AI models used by 2nd Chair.
Every technology company relies on outside vendors for auxiliary services required to deliver its product, and 2nd Chair is no exception. 2nd Chair is securing zero-data-retention agreements with all of its vendors.
AI technologies are new, and their use in legal services is newer still, so AI in the legal sphere can raise valid concerns and questions. However, 2nd Chair has built David with failsafes in place to help ensure that private and personal data remains private, and the AI system behind David was designed with the privacy issues and needs of lawyers and other legal users in mind. Some other products on the market are built by non-lawyers but for legal purposes. We think that's a mistake. 2nd Chair is built by our team of experts, including lawyers, privacy professionals, paralegals, and consultants, all of whom ensure our products meet the legal and ethical obligations of attorneys.