Chat with Groq: A Showcase of WeWeb-native Conversational Interface

Hey, WeWeb Community!

I’m excited to share a showcase project that demonstrates integrating Groq’s conversational AI models into a WeWeb-built interface. It shows how WeWeb’s visual development platform can be used to build a customized chat interface that talks to multiple models through a REST API.

Try it out (no account needed):

Key Features:

  1. WeWeb-Powered Interface: The entire chat interface is built using WeWeb’s visual development platform, demonstrating its capabilities in building complex user interfaces.

  2. Groq Integration: The interface communicates with Groq’s Large Language Models (LLMs) through a REST API, showcasing the power of connecting WeWeb to external services.

  3. Model Switching: Easily switch between models during the same conversation while maintaining the context of past messages. Supported models include llama3-70b-8192, llama3-8b-8192, mixtral-8x7b-32768, and gemma-7b-it.

  4. Local-Only Experience: Everything is local, and nothing is saved in a database.

  5. No Account Required: You don’t need to create an account or provide an API key (although there are some limitations with the plugin).

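For anyone curious how model switching can keep the context of past messages, here is a minimal sketch of the kind of request involved. It assumes Groq’s OpenAI-compatible chat completions endpoint; the URL, field names, and helper functions are my own reading of Groq’s public docs, not code from this project:

```typescript
// A chat message as expected by OpenAI-style chat completion APIs.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// Build the request body. The full message history is resent on every
// call, so changing `model` mid-conversation keeps the prior context.
function buildChatRequest(model: string, history: ChatMessage[]) {
  return {
    model, // e.g. "llama3-70b-8192" or "mixtral-8x7b-32768"
    messages: history,
  };
}

// Send the conversation to Groq and return the assistant's reply.
// (Endpoint path per Groq's public documentation; unverified here.)
async function sendChat(
  apiKey: string,
  model: string,
  history: ChatMessage[]
): Promise<string> {
  const res = await fetch("https://api.groq.com/openai/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(buildChatRequest(model, history)),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Because the history travels with each request, the backend is stateless, which is also what makes the local-only, no-database setup possible.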
Please note that streaming responses are not implemented in this example, as the responses are already fast. Additionally, conversation history is not included, as this project is primarily a demonstration.

While there are still some bugs and optimizations to address, I plan to share both the raw file code and the project code. Once the marketplace is available, I intend to share it there for free.

Unfortunately, due to the limitations of my current plan and the lack of a staging domain, this UI will likely be replaced by another one after May 24th.

Feel free to try it out yourself:

Let me know if you have any other questions or feedback!


Thanks for sharing!
Great work :partying_face:


Thanks!
It’s of course totally free!
I wish I could export it properly and simply to Vercel or elsewhere, but for some reason I didn’t succeed in doing it through the raw files. Have there been any internal changes compared to a few months ago, when I didn’t have any issue with it?

Here’s a video I made!


Now you can have chat conversation history!
(It’s only saved locally in your browser!)

No need to have an API key or create an account!
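A browser-local history like this is typically just the message list serialized into `localStorage`. Here is a minimal sketch under that assumption; the storage key and function names are hypothetical, not taken from the project (the small `KVStore` interface just stands in for `window.localStorage` so the logic is testable outside a browser):

```typescript
type ChatMessage = { role: string; content: string };

// Stand-in for window.localStorage's getItem/setItem surface.
interface KVStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

// Hypothetical storage key; the real project may use a different one.
const STORAGE_KEY = "chat-history";

// Persist the whole conversation locally; nothing leaves the browser.
function saveHistory(store: KVStore, history: ChatMessage[]): void {
  store.setItem(STORAGE_KEY, JSON.stringify(history));
}

// Restore the conversation on page load (empty list if none saved).
function loadHistory(store: KVStore): ChatMessage[] {
  const raw = store.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as ChatMessage[]) : [];
}
```

In the browser you would pass `window.localStorage` as the store, which is why clearing site data also clears the conversation.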

It’s up again, just without local history; I need to find the last export, if I made one before!