Paolo Ardoino Demonstrates QVAC, an AI Assistant Running 100% Locally That Connects to Asana Without Using the Cloud - Crypto Economy
TL;DR
Paolo Ardoino, CEO of Tether, presented a live proof-of-concept of QVAC, an artificial intelligence assistant that runs inference entirely on a mid-range laptop, with no remote servers or cloud connections. The demonstration showed QVAC creating tasks and subtasks in Asana from natural language commands, processing requests in roughly one second with minimal token usage.
https://twitter.com/paoloardoino/status/2021893989352542332
QVAC’s architecture runs entirely on the GPU of the user’s local machine. Unlike assistants that send queries to centralized servers for processing, QVAC keeps all data and operations on the computer. The model generates responses without leaking sensitive information to external infrastructure, protecting user privacy and eliminating dependence on cloud service providers.
Integration with external tools like Asana works through the Model Context Protocol (MCP), an open standard developed by Anthropic and launched in November 2024. MCP establishes a common format for AI models to communicate with applications, databases, and external services in a structured way.
Before MCP, each integration between an AI model and a tool required custom development. The protocol standardizes the process with a client-server architecture: the model acts as the client, and tools expose their functions through MCP servers.
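MCP frames these standardized exchanges as JSON-RPC 2.0 messages. A minimal sketch of a tool invocation request follows; the `tools/call` method and field names match MCP's published message shape, but the tool name and arguments here are invented for illustration, not Asana's real interface:

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build an MCP-style tool invocation framed as a JSON-RPC 2.0 request.

    The "tools/call" method and params layout follow MCP's message shape;
    the specific tool and arguments are hypothetical examples.
    """
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# A hypothetical request the local assistant might send to a task server.
request = make_tool_call(1, "create_task", {"title": "Prepare Q3 report"})
print(json.dumps(request, indent=2))
```

Because every server speaks the same framing, adding a new tool means writing one MCP server rather than a bespoke integration per model.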
How the Model Context Protocol Enables Local AI to Access External Tools Without Compromising Privacy
MCP works like a universal connection bridge. The client — in QVAC’s case, the AI assistant running locally — connects to MCP servers exposing specific resources or capabilities. Each server can represent a different tool: Asana for task management, local file systems, databases, calendars, or any compatible application. The client requests actions through the standardized protocol, and the server executes the operation and returns results.
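The bridge pattern described above can be sketched with plain Python classes — a simplified stand-in for the real MCP SDK, showing how one client-side interface covers many interchangeable servers. The server classes and tool names are hypothetical:

```python
class ToolServer:
    """Base class: each server exposes named tools behind one shared protocol."""

    def list_tools(self):
        # Tool discovery: the client asks what this server can do.
        return list(self._tools)

    def call(self, name, arguments):
        # Tool invocation: the client sends a name plus structured arguments.
        return self._tools[name](**arguments)

class AsanaServer(ToolServer):
    """Hypothetical task-management server (not the real Asana integration)."""
    def __init__(self):
        self._tools = {"create_task": lambda title: f"task created: {title}"}

class CalendarServer(ToolServer):
    """Hypothetical calendar server, exposed through the same interface."""
    def __init__(self):
        self._tools = {"add_event": lambda title, when: f"event '{title}' at {when}"}

# The client treats every server identically: discover tools, then call them.
servers = {"asana": AsanaServer(), "calendar": CalendarServer()}
for name, server in servers.items():
    print(name, server.list_tools())
print(servers["asana"].call("create_task", {"title": "Draft roadmap"}))
```

The key design point is that the client never needs tool-specific code: swapping Asana for a calendar or a database only changes which server it connects to.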

In Ardoino’s demonstration, QVAC processed a verbal instruction, interpreted user intent, connected to the Asana MCP server, and created a main task along with a related subtask. The entire process occurred without data leaving the user’s machine. The Asana MCP server can operate locally or in an environment controlled by the user, maintaining the project’s core privacy principle.
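That task-and-subtask flow can be sketched locally with an invented in-memory task store standing in for the Asana server — everything below runs on the user's machine, mirroring the demo's privacy principle:

```python
class LocalTaskServer:
    """Hypothetical in-memory stand-in for an Asana-style MCP server."""

    def __init__(self):
        self.tasks = {}
        self._next_id = 1

    def create_task(self, title, parent_id=None):
        # Subtasks are just tasks that point at a parent task's id.
        task_id = self._next_id
        self._next_id += 1
        self.tasks[task_id] = {"title": title, "parent": parent_id}
        return task_id

server = LocalTaskServer()
main_id = server.create_task("Launch QVAC beta")
sub_id = server.create_task("Write release notes", parent_id=main_id)
print(server.tasks[sub_id])  # the subtask record links back to the main task
```

Because the store lives in the same process, no request ever crosses the network — the same property the demo claims for a locally hosted MCP server.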
Ardoino confirmed the project will be released as open source soon.
The decision allows developers to audit the code, adapt the assistant to specific use cases, and build custom MCP servers for additional tools. Open access also promotes adoption of the decentralized AI paradigm, where users control their data and model execution happens on hardware they own or manage directly.
The demonstration shows the viability of running capable AI assistants on consumer hardware without sacrificing functionality. Large models have traditionally required server clusters or expensive graphics cards; QVAC demonstrates that a mid-range GPU can handle inference with response times acceptable for daily use, lowering the entry barrier for users who prioritize privacy and autonomy over cloud-based convenience.