
Unlock the Power of AI with Vectara Portal: A No-Code Solution for Building AI Applications


Vectara, a company based in Palo Alto, California, has recently introduced Vectara Portal, an open-source environment that simplifies the development of generative AI applications. The tool sets itself apart from commercial offerings through its accessibility and ease of use: even users with limited technical skills can create search, summarization, or chat apps grounded in their own datasets, without writing any code.

The potential of Vectara Portal is vast, as it allows non-developers to utilize AI in various use cases within their organizations, such as policy or invoice search. However, it is important to note that the tool is still new, and its performance is currently being tested by a small number of beta users.

Ofer Mendelevitch, Vectara’s head of developer relations, believes that the tool will see massive adoption by non-developers due to its simplicity and the fact that it is powered by Vectara’s proprietary retrieval augmented generation (RAG) platform. This increased adoption is expected to drive traction for the company’s enterprise-grade offerings.

So, how does Vectara Portal actually work? It is available as both a hosted app and an open-source offering under the Apache 2.0 license. Users create portals, which are custom applications, and share them with their intended audience. To get started, users need to create a Portal account using their Vectara account credentials and set up their profile with their Vectara ID, API Key, and OAuth client ID. After that, they can create a portal by providing basic details like the name, description, and intended purpose of the app (semantic search, summarization, or conversational chat assistant). The created portal will then be added to the Portal management page.
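For readers exploring the self-hosted (Apache 2.0) version, the profile essentially boils down to a small set of Vectara credentials. The sketch below is illustrative only: the variable and environment names are hypothetical, not taken from the Portal codebase, but they show the three values a portal needs in order to talk to Vectara's platform.

```python
# Illustrative only: names are hypothetical, not from the Portal codebase.
# A portal profile boils down to three Vectara credentials.
import os
from dataclasses import dataclass

@dataclass
class VectaraProfile:
    customer_id: str      # the Vectara (customer) ID
    api_key: str          # an API key scoped to the target corpus
    oauth_client_id: str  # OAuth client ID, used for authenticated flows

def load_profile() -> VectaraProfile:
    # Read credentials from the environment rather than hard-coding them.
    return VectaraProfile(
        customer_id=os.environ["VECTARA_CUSTOMER_ID"],
        api_key=os.environ["VECTARA_API_KEY"],
        oauth_client_id=os.environ["VECTARA_OAUTH_CLIENT_ID"],
    )
```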

Once a portal is created, users can add documents to customize the app to their data. These documents are indexed by Vectara’s RAG-as-a-service platform, which powers the portal’s backend and enables accurate, reliable answers. Users can also choose which large language model (LLM) processes their queries, selecting from the models Vectara provides, such as its own Mockingbird LLM or models from OpenAI.
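As a rough illustration of what “adding documents” means under the hood, the sketch below uploads a file to a Vectara corpus via the platform’s v1 file-upload endpoint. It is a minimal sketch, not the Portal’s actual code; the endpoint shape and parameters are assumptions based on Vectara’s public REST API and may differ in newer API versions.

```python
# Minimal sketch of indexing a document into a Vectara corpus.
# Assumes the v1 file-upload endpoint; details may differ in newer API versions.
import requests

def upload_document(profile: VectaraProfile, corpus_id: int, path: str) -> None:
    with open(path, "rb") as f:
        response = requests.post(
            "https://api.vectara.io/v1/upload",
            params={"c": profile.customer_id, "o": corpus_id},  # customer and corpus IDs
            headers={"x-api-key": profile.api_key},
            files={"file": f},
        )
    response.raise_for_status()  # the file is parsed and indexed server-side
```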

When a user asks a question on the portal, Vectara’s RAG API runs the query against the associated corpus, which is specific to the user’s data. The API retrieves the most relevant parts of the documents and feeds them into the chosen LLM. This process ensures that the user receives the most accurate answer to their question.
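To make that flow concrete, here is a minimal sketch of a grounded query against a corpus using Vectara’s v1 query endpoint: the service retrieves the most relevant passages and, when a summary is requested, has the selected LLM generate an answer grounded in them. The request shape is an assumption based on Vectara’s public API documentation and may not match what Portal sends internally.

```python
# Minimal sketch of a grounded query: retrieve relevant passages, then summarize them.
# The request body follows Vectara's v1 query API as understood from public docs;
# Portal's internal calls may differ.
import requests

def ask(profile: VectaraProfile, corpus_id: int, question: str) -> dict:
    body = {
        "query": [{
            "query": question,
            "numResults": 5,  # how many matching passages to retrieve
            "corpusKey": [{
                "customerId": int(profile.customer_id),
                "corpusId": corpus_id,
            }],
            # Requesting a summary turns retrieval into full RAG:
            # the retrieved passages are fed to the chosen LLM.
            "summary": [{"maxSummarizedResults": 5, "responseLang": "eng"}],
        }]
    }
    response = requests.post(
        "https://api.vectara.io/v1/query",
        json=body,
        headers={
            "customer-id": profile.customer_id,
            "x-api-key": profile.api_key,
        },
    )
    response.raise_for_status()
    return response.json()  # contains both the retrieved passages and the summary
```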

Vectara aims to attract more enterprise customers with its no-code offering, both as a hosted app and an open-source product. By letting users build powerful generative AI apps without writing code, the company hopes to increase sign-ups, generate interest in its core RAG-as-a-service offering, and, in turn, improve conversion rates.

With over $50 million in funding and approximately 50 production customers, including notable names like Obeikan Group, Juniper Networks, Sonosim, and Qumulo, Vectara is poised to make a significant impact in the AI development space. The introduction of Vectara Portal will likely further solidify their position in the industry and attract new users from all skill levels.
