SARUSLLM

Unleash the power of Generative AI
while keeping private data safe

SarusLLM is intended for businesses and developers who want to leverage the full power of open source LLMs while ensuring that no sensitive information is directly accessed, embedded in the model weights, or revealed at inference time.

The Issue

LLMs are excellent at memorizing the information they’re trained on, and are likely to regurgitate sensitive information when prompted, posing serious privacy risks.

Here are two chatbot examples of an LLM (Mistral 7B) fine-tuned on sensitive medical information.
The first chatbot is fine-tuned without differential privacy and immediately leaks information about the patient.
The second chatbot is fine-tuned with DP and keeps that information protected.

Sarus Answer

Prevent LLMs from leaking private information thanks to Differential Privacy.

Check out our demo notebook!

Want to test?
Leave your email!


We will share a demo instance where you can fine-tune most open source LLMs (Mistral, Llama, GPT...) with or without privacy guarantees in just a few lines of code, and see for yourself that memorization can be prevented!
In your own infrastructure, you only need to install the Sarus app and use the Sarus Python SDK to launch fine-tuning jobs with privacy guarantees. Sarus leverages the available GPUs when needed. Your safe models can then be put into production as usual!

SarusLLM lets data practitioners work with LLMs in a privacy-safe way
based on two main capabilities:

Train LLMs in clean rooms

Data scientists explore and preprocess data, and feed it to LLMs, without ever seeing it directly. Only high-quality synthetic data and differentially private statistics can be retrieved from the clean room. To do so, data scientists use their usual AI and GenAI tools, wrapped in the Sarus Python SDK.
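For intuition, here is a minimal sketch of how a differentially private statistic can be released from a clean room. This is an illustrative toy using the standard Laplace mechanism on a simple count, not the Sarus SDK; all names are ours:

```python
import math
import random

def dp_count(records, epsilon):
    # The true count has sensitivity 1: adding or removing one record
    # changes it by at most 1, so Laplace noise with scale 1/epsilon
    # is enough for epsilon-differential privacy.
    scale = 1.0 / epsilon
    # Sample Laplace(0, scale) noise via inverse-CDF sampling.
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return len(records) + noise

random.seed(0)
patients = ["record_%d" % i for i in range(1000)]
noisy = dp_count(patients, epsilon=1.0)
```

Because any single patient changes the true count by at most 1, noise of scale 1/epsilon is enough to hide each individual's presence in the released statistic.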

Prevent LLMs from leaking private information

Differential Privacy guarantees can be included in the LLM fine-tuning process itself, through a single fit parameter. This ensures that no personal data is embedded in the fine-tuned model, thanks to automated Differentially Private Stochastic Gradient Descent (DP-SGD).
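For intuition, the core DP-SGD update (clip each per-example gradient, average, then add calibrated Gaussian noise) can be sketched in a few lines of NumPy. This is an illustrative toy, not the Sarus implementation; all names here are ours:

```python
import numpy as np

def dp_sgd_step(params, per_example_grads, clip_norm, noise_multiplier, lr, rng):
    # 1. Clip each per-example gradient to L2 norm <= clip_norm,
    #    bounding any single example's influence on the update.
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    # 2. Sum the clipped gradients, add Gaussian noise calibrated to
    #    (noise_multiplier * clip_norm), then average; the
    #    (noise_multiplier, clip_norm) pair sets the privacy cost.
    n = len(per_example_grads)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=params.shape)
    noisy_mean = (np.sum(clipped, axis=0) + noise) / n
    # 3. Standard gradient descent update with the privatized gradient.
    return params - lr * noisy_mean

rng = np.random.default_rng(0)
params = np.zeros(4)
per_example_grads = [rng.normal(size=4) for _ in range(32)]
updated = dp_sgd_step(params, per_example_grads,
                      clip_norm=1.0, noise_multiplier=1.1, lr=0.1, rng=rng)
```

Because every example's gradient is clipped before noise is added, no single training record, however unusual, can dominate the update, which is what prevents the model from memorizing it.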

Why SarusLLM?

Maximize GenAI ROI while preserving data privacy

LLM experimentation starts on day one: data teams avoid paperwork and focus on creating value for the enterprise. All internal and external datasets are available for LLM projects thanks to Sarus' no-data-access approach.

Reach the highest level of privacy and security for your LLM workflows 

Enjoy Sarus' zero-trust approach: no data is ever directly accessed by data scientists, and model tuning is protected by Differential Privacy, the highest standard of privacy. The company is protected from uncontrolled data spillage and meets regulators' highest expectations.

Experiment with LLM fine-tuning using the Sarus SDK

Launch fine-tuning jobs in just a few lines of code for any open source LLM family (Llama, Mistral, GPT-2...), any data, and any infrastructure (Azure, GCP, Databricks, on-premises...). Sarus orchestrates GPU resources, so data scientists don't have to worry about them.

Use case examples

Healthcare: Build a synthetic patient-records generator without compromising patient privacy
Healthcare: Improve the embedding model of a retrieval-augmented medical assistant
Banking, Insurance, Retail: Build a customer-service chatbot powered by a model fine-tuned on all enterprise internal documents and historical customer-service discussions
Insurance, Banking: Build an AI model for fraud detection on unstructured data, with no risk of customer data breach
Banking: Generate privacy-preserving synthetic data from any data source for debugging and testing
Marketing: Fine-tune an LLM-based lookalike model on your private data and a partner's, without either of you sharing the underlying data

Put the power of LLMs into action without any privacy risk

Ready to go? Book a meeting with a Sarus expert.
Note that, like everyone, we are experimenting with LLMs, and as a consequence we're open to co-building!
Should you have any custom needs around privacy and LLMs, let us know! We'd be happy to have you as a design partner.

Subscribe to our newsletter

128 rue La Boétie
75008 Paris — France
©2023 Sarus Technologies.
All rights reserved.