Tinexta Visura reshapes legal search with Elasticsearch and generative AI, hosted on Google Cloud

With Elastic, Tinexta Visura clients can cut legal research and drafting time by anywhere from one hour to two full days, depending on the complexity of the case.

Reduces generative AI costs

Tinexta Visura can filter more than 4.8M legal documents to reduce token usage costs with Elastic.

Hybrid search methods in Elastic give Tinexta Visura greater precision and richer context for legal queries.

Tinexta Visura is a pioneer in the use of generative AI in the Italian legal field. The Italian technology leader, known for its expertise in digital trust, cybersecurity, and professional innovation services, recently unveiled a groundbreaking search platform that streamlines workflows and boosts productivity for law firms and in-house legal teams.

These organizations need fast, reliable access to vast amounts of legal information, whether they're preparing for court or conducting legal research. Tinexta Visura's new platform, Lextel AI, transforms how legal professionals interact with legal texts and sources, taking full advantage of Elasticsearch for text search, semantic search, and retrieval-augmented generation (RAG).

Andrea Vingolo, General Manager at Tinexta Visura, says that they chose Elasticsearch as the foundation for Lextel not just for its ability to handle complex queries but also for features that help draft summaries, legal opinions, and memos. By grounding the generative output in citable documents, Lextel AI provides legal professionals with the transparency they need to use the system confidently.

Cost efficiency was also a major consideration. Lextel AI uses Elasticsearch, running on Elastic Cloud and Google Cloud, to manage a repository of 4.8 million documents, each averaging 15 pages. Giancarlo Facoetti, Head of AI Strategy for Tinexta Innovation Hub, the organization's research arm, says, "With Elasticsearch, we can filter that content before it even reaches the generative layer, dramatically reducing token usage costs."
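
A minimal sketch of what that pre-filtering step can look like with the Elasticsearch Python client is shown below. The index name, field names, and filter values are illustrative assumptions, not Tinexta Visura's actual schema; the point is that only a handful of filtered hits ever reach the generative model.

```python
# Hypothetical example: narrow millions of documents down to a few relevant
# hits before anything reaches the LLM, keeping the prompt (and token cost) small.
from elasticsearch import Elasticsearch

es = Elasticsearch("https://localhost:9200", api_key="...")  # placeholder endpoint

resp = es.search(
    index="legal-documents",                                 # assumed index name
    query={
        "bool": {
            "must": {"match": {"body": "wrongful termination damages"}},
            "filter": [
                {"term": {"jurisdiction": "IT"}},            # metadata filters
                {"range": {"decision_date": {"gte": "2018-01-01"}}},
            ],
        }
    },
    size=5,                                                  # only the top passages
)

# Only these few hits form the context passed to the generative layer.
context = "\n\n".join(hit["_source"]["body"] for hit in resp["hits"]["hits"])
```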

Facoetti, who led the development of Lextel AI, highlights two key innovations enabled by Elasticsearch on Google Cloud. First, it enables hybrid search by combining BM25 keyword-based search and vector-based semantic retrieval, allowing legal professionals to enter detailed and context-aware queries. Second, it supports generative AI, powered by Google Gemini, to produce structured summaries and legal opinions that are fully traceable for maximum transparency.

"This combination of flexible retrieval plus generative augmentation really transforms the legal workflow," says Facoetti.
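
A minimal sketch of such a hybrid request with the Elasticsearch Python client follows; the index, field names, and vector dimension are assumptions, and the zero vector stands in for the output of whatever embedding model is used at index time.

```python
# Hypothetical example: one request combining BM25 keyword relevance with
# approximate kNN over a dense_vector field.
from elasticsearch import Elasticsearch

es = Elasticsearch("https://localhost:9200", api_key="...")  # placeholder endpoint

query_text = "liability of directors for negligent management"

# Stand-in vector; in practice this comes from the same embedding model
# used when the documents were indexed.
query_vector = [0.0] * 768

resp = es.search(
    index="legal-documents",                   # assumed index name
    query={"match": {"body": query_text}},     # lexical (BM25) relevance
    knn={
        "field": "body_vector",                # assumed dense_vector field
        "query_vector": query_vector,
        "k": 20,
        "num_candidates": 200,
    },
    size=20,
)

# Lexical and vector scores are combined, so results reflect both exact
# legal terminology and semantic similarity.
for hit in resp["hits"]["hits"]:
    print(hit["_id"], hit["_score"])
```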

For legal professionals used to spending long days, and sometimes nights, researching case materials, Lextel AI is a godsend. The platform saves lawyers anywhere from one hour to two full days of work, depending on the complexity of the task. For example, when an attorney needs to review hundreds of court rulings to support a specific legal argument, Lextel AI retrieves the most relevant cases, highlights the most important parts of the source material, and compiles a draft legal opinion, all in a fraction of the time it would take with a traditional search platform.

"Elasticsearch massively reduces the complexity around semantic search. You don’t have to stitch together multiple components, which translates to fewer systems to monitor, maintain, and troubleshoot—and fewer headaches for everyone from our innovation team to our client end users."

– Giancarlo Facoetti, Head of AI Strategy, Tinexta Innovation Hub

Building success on a platform of trust

Vingolo emphasizes the importance of partnership. "I've worked with several major enterprise platforms, and I know that choosing a technology isn't just about the product itself. It's also about the service, the people behind it, and the trust they build with your team. Elastic gave us that trust."

At a key moment during Lextel AI's development, the team hit a challenge uploading customer embeddings into Elasticsearch. A call to Elastic's support team led to a quick resolution.

Elastic's roadmap is equally reassuring. As vector databases become increasingly central to modern search platforms, Elasticsearch's ability to compute embeddings natively lets Tinexta Visura's innovation team consolidate everything into a single system without any additional infrastructure. Elastic's Ranking Evaluation API, which evaluates the quality of ranked search results over a set of typical search queries, also stands out as a potentially valuable feature thanks to its adaptability and support for continuous relevance tuning.
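
A minimal sketch of how a team could exercise that API through the Elasticsearch Python client is shown below; the index, query, document IDs, and relevance ratings are all illustrative.

```python
# Hypothetical example: rate a few known documents for a typical legal query
# and measure precision@10 with the Ranking Evaluation API (_rank_eval).
from elasticsearch import Elasticsearch

es = Elasticsearch("https://localhost:9200", api_key="...")  # placeholder endpoint

resp = es.rank_eval(
    index="legal-documents",                                 # assumed index name
    requests=[
        {
            "id": "limitation_period_query",
            "request": {
                "query": {"match": {"body": "statute of limitations contract claims"}}
            },
            "ratings": [                                     # human relevance judgments
                {"_index": "legal-documents", "_id": "ruling-001", "rating": 2},
                {"_index": "legal-documents", "_id": "ruling-017", "rating": 1},
                {"_index": "legal-documents", "_id": "ruling-042", "rating": 0},
            ],
        }
    ],
    metric={"precision": {"k": 10, "relevant_rating_threshold": 1}},
)

print(resp["metric_score"])  # aggregate precision@10 over the rated queries
```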

Facoetti highlights the breadth and integration of features in Elasticsearch as key differentiators. Traditional vector databases might let you store and search embeddings, but with Elasticsearch, organizations can manage full-text content, metadata, embeddings, and ranking, all in one platform.

"Elasticsearch massively reduces the complexity around semantic search. You don't have to stitch together multiple components, which translates to fewer systems to monitor, maintain, and troubleshoot—and fewer headaches for everyone from our innovation team to our client end users," he says.

"I've worked with several major enterprise platforms, and I know that choosing a technology isn't just about the product itself. It's also about the service, the people behind it, and the trust they build with your team. Elastic on Google Cloud gave us that trust."

– Andrea Vingolo, General Manager, Tinexta Visura
