Knostic brings access control to LLMs

Security startup Knostic is the latest company to tackle the challenges organizations face when adopting generative AI tools. Knostic emerged from stealth with $3.3 million in pre-seed funding to introduce "need to know" access controls for large language models.

Companies on their AI transformation journey are spreading AI capabilities throughout their workflows and processes to boost productivity, reduce costs and improve efficiency, says Gadi Evron, co-founder and CEO of Knostic. Companies are adopting large language models to build ChatGPT-like enterprise search systems on top of their own data sources, or by enabling built-in features in the applications and platforms they already use. Data privacy is one of the biggest barriers to AI adoption, Evron says, noting that unchecked AI exposes the organization to greater risk, primarily by surfacing information to the wrong person.

“How can we curate personalized information and actually give it value? Respond with what you need to know instead of just saying things,” says Evron.

Access control for LLMs is necessary

With Knostic, employees can access everything they need and receive answers that align with what they need to do their jobs.

For example, an organization may have a system that can answer questions about features expected in the next product release, the latest sales numbers and revenue data, the bonus structure, due diligence findings in a merger or acquisition, or the status of an infrastructure project. But not everyone should get the same answer to every question. While the CFO and CTO need to know quarterly sales revenue, the marketing intern probably doesn't, notes Evron. Knostic's access control engine evaluates whether the response is appropriate for the questioner's role, and if not, responds with a "Sorry, this is confidential information," Evron says. Or, instead of simply saying no, the system may respond that, even though the answer is confidential, the marketing campaigns the intern worked on increased sales during the quarter. This is where customization and curation come into play.

One thing Evron points out is that traditional access control is binary: either the person has access or they don't. Knostic's focus on "need to know" makes it possible to provide some information even when the answer is no. "When we say no, we're not making it easier for the business," says Evron, noting that providing information in a different format or with related context helps the business user more than a flat refusal.
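The difference between a binary deny and a shaped, need-to-know response can be sketched as a simple policy check. The roles, topics, and policy table below are illustrative assumptions, not Knostic's actual implementation:

```python
# Hypothetical sketch of "need to know" response shaping for an LLM answer.
# Roles, topics, and the policy table are illustrative, not Knostic's API.

SENSITIVE_TOPICS = {
    "quarterly_revenue": {"cfo", "cto"},  # roles allowed the full answer
}

def shape_response(role: str, topic: str,
                   full_answer: str, related_context: str) -> str:
    allowed = SENSITIVE_TOPICS.get(topic)
    if allowed is None or role in allowed:
        return full_answer  # non-sensitive topic, or role has need to know
    # Instead of a flat denial, offer related, non-confidential context.
    return f"Sorry, this is confidential information. However, {related_context}"

print(shape_response("cfo", "quarterly_revenue", "Q2 revenue was $12M.", ""))
print(shape_response("marketing_intern", "quarterly_revenue",
                     "Q2 revenue was $12M.",
                     "the campaigns you worked on increased sales this quarter."))
```

The intern's query is not simply rejected; the confidential figure is withheld while related context the intern is entitled to see is returned instead.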

“Once you understand what you are allowed to know, you can solve the DLP [Data Loss Prevention] and IAM [Identity and Access Management] problems,” says Evron.

What “need to know” looks like

When thinking about access control, organizations need to consider factors such as whether the system is internal or public-facing, whether the data used to generate responses is sensitive, and the role of the person asking the questions, says Knostic co-founder Sounil Yu.

There has been much discussion about how organizations should build guardrails into AI systems to prevent abuse or to block responses that could cause harm. However, guardrails tend to be one-size-fits-all and don't take a person's specific circumstances into account, Yu says. He points to the many publicly available chatbots that refuse to provide medical information because the user may not be a medical professional and the tool should not be used for diagnostics. But if it's a doctor trying to access information as part of their research, that particular restriction isn't helpful. Access control, unlike guardrails, takes into account factors such as time, data sensitivity and the person's role to determine how to shape responses.

For example, a company may have a customer service chatbot that helps customers troubleshoot and resolve common issues. That chatbot has access to the same internal knowledge base articles a customer service representative would use. But what if there is an article about a product that is not yet on the market (say, the latest iPhone)? The customer service representative needs that information in advance, for training and to be ready when the product launches. But the company would face serious problems if a customer learned product details from the chatbot before the launch.

Instead of creating two systems, one for internal use and one public-facing, the company can conceivably use Knostic’s approach to provide different answers to the customer and the customer service representative.
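Serving both audiences from one knowledge base can be sketched as a visibility filter applied before the chatbot retrieves articles. The field names, roles, and sample articles here are illustrative assumptions, not Knostic's design:

```python
# Hypothetical sketch: one knowledge base serving both a public chatbot and
# internal reps, with pre-release articles hidden from customers.
# Fields, roles, and articles are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Article:
    title: str
    body: str
    released: bool  # False for products not yet on the market

KB = [
    Article("Widget setup guide", "Plug it in and pair it.", released=True),
    Article("Widget 2 (pre-release) setup", "Requires the new dock.", released=False),
]

def visible_articles(audience: str) -> list[Article]:
    if audience == "internal_rep":
        return KB  # reps see everything, including pre-release material
    return [a for a in KB if a.released]  # customers see only released products

print([a.title for a in visible_articles("customer")])
print([a.title for a in visible_articles("internal_rep")])
```

The same retrieval pipeline then answers both audiences, with the filter deciding which articles each question can draw on.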

Company Details

Evron and Yu have deep industry experience. Evron was the founder of deception startup Cymmetria and previously held roles at Citibank and PwC. Yu is the former chief security scientist at Bank of America and former CISO and head of research at JupiterOne.

Knostic, founded in 2023, has raised $3.3 million in pre-seed funding from Shield Capital, Pitango First, DNS Ventures, Seedcamp and several angel investors. Retired Admiral Mike Rogers, former head of the National Security Agency, said in a statement that the startup will “unlock LLM for enterprises.”

Knostic has customers in a wide range of industries, including financial services and retail.

The company is also one of the top three finalists for the RSA Conference Launch Pad 2024. At the Launch Pad, founders of young companies (established for two years or less) pitch ideas and products “on the verge of becoming the next big thing” to a group of venture capitalists.
