Paolo Ardoino Demonstrates QVAC, an AI Assistant Running 100% Locally That Connects to Asana Without Using the Cloud
TL;DR
Paolo Ardoino, CEO of Tether, presented a live proof-of-concept for QVAC, an artificial intelligence assistant executing complete inference on a mid-range laptop without relying on remote servers or cloud connections. The demonstration showed QVAC creating tasks and subtasks in Asana through natural language commands, processing requests in approximately one second and consuming minimal computational tokens.
https://twitter.com/paoloardoino/status/2021893989352542332
QVAC’s architecture runs entirely on the GPU of the user’s local machine. Unlike assistants that send queries to centralized servers for processing, QVAC keeps all data and operations on the user’s computer. The model generates responses without leaking sensitive information to external infrastructure, protecting user privacy and eliminating dependence on cloud service providers.
Integration with external tools like Asana works through the Model Context Protocol (MCP), an open standard developed by Anthropic and launched in November 2024. MCP establishes a common format for AI models to communicate with applications, databases, and external services in a structured way.
Before MCP, each integration between an AI model and a tool required custom development. The protocol standardizes the process through a client-server architecture in which the model acts as the client and tools expose their functions through MCP servers.
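To make that concrete, MCP messages follow JSON-RPC 2.0: a tool invocation is a tools/call request carrying the tool's name and arguments, and the server replies with a structured result. The sketch below shows the shape of such an exchange as Python dictionaries; the create_task tool and its arguments are hypothetical illustrations, not the actual Asana integration used in the demo.

```python
# Shape of an MCP tool invocation (JSON-RPC 2.0), written as Python dicts.
# The tool name "create_task" and its arguments are illustrative only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",      # standard MCP method for invoking a tool
    "params": {
        "name": "create_task",   # which tool the server should run
        "arguments": {           # tool-specific arguments
            "project": "Website redesign",
            "title": "Draft launch checklist",
        },
    },
}

response = {
    "jsonrpc": "2.0",
    "id": 1,                     # echoes the request id
    "result": {
        "content": [             # MCP tool results are a list of content items
            {"type": "text", "text": "Created task 'Draft launch checklist'"}
        ]
    },
}
```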
How the Model Context Protocol Enables Local AI to Access External Tools Without Compromising Privacy
MCP works like a universal connection bridge. The client (in QVAC’s case, the AI assistant running locally) connects to MCP servers that expose specific resources or capabilities. Each server can represent a different tool: Asana for task management, local file systems, databases, calendars, or any compatible application. The client requests actions through the standardized protocol, and the server executes the operation and returns the results.
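A minimal sketch of the server side of that bridge, assuming MCP's stdio transport and skipping the initialize handshake and tools/list discovery step: the process reads JSON-RPC requests line by line, runs the requested tool, and writes the result back. The create_task tool here is hypothetical and only records tasks in memory; a real Asana MCP server would call Asana's API instead.

```python
#!/usr/bin/env python3
"""Toy MCP-style server over stdio (handshake and tool discovery omitted).
The create_task tool is hypothetical; a real Asana MCP server would call Asana's API."""
import json
import sys

TASKS = []  # in-memory stand-in for a real task backend


def create_task(arguments):
    task = {"title": arguments["title"], "parent": arguments.get("parent")}
    TASKS.append(task)
    kind = "subtask" if task["parent"] else "task"
    return f"Created {kind} '{task['title']}'"


def handle(request):
    params = request.get("params", {})
    if request["method"] == "tools/call" and params.get("name") == "create_task":
        text = create_task(params["arguments"])
    else:
        text = "unsupported request"
    return {
        "jsonrpc": "2.0",
        "id": request["id"],
        "result": {"content": [{"type": "text", "text": text}]},
    }


if __name__ == "__main__":
    for line in sys.stdin:  # one JSON-RPC message per line
        if line.strip():
            sys.stdout.write(json.dumps(handle(json.loads(line))) + "\n")
            sys.stdout.flush()
```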

In Ardoino’s demonstration, QVAC processed a verbal instruction, interpreted user intent, connected to the Asana MCP server, and created a main task along with a related subtask. The entire process occurred without data leaving the user’s machine. The Asana MCP server can operate locally or in an environment controlled by the user, maintaining the project’s core privacy principle.
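Under the same assumptions, the client side of the flow shown in the demo could look like the sketch below: the locally running assistant translates the interpreted intent into two tools/call requests, one for the main task and one for the subtask, sent to the server process over stdio. File, tool, and field names are hypothetical; QVAC's actual prompt-to-tool-call pipeline has not been published.

```python
#!/usr/bin/env python3
"""Toy MCP-style client: issues two tools/call requests (task + subtask) over stdio.
Assumes the toy server above is saved locally as toy_mcp_server.py (hypothetical name)."""
import json
import subprocess


def call_tool(proc, req_id, name, arguments):
    request = {"jsonrpc": "2.0", "id": req_id, "method": "tools/call",
               "params": {"name": name, "arguments": arguments}}
    proc.stdin.write(json.dumps(request) + "\n")
    proc.stdin.flush()
    reply = json.loads(proc.stdout.readline())
    return reply["result"]["content"][0]["text"]


if __name__ == "__main__":
    server = subprocess.Popen(
        ["python3", "toy_mcp_server.py"],  # local server process; no network involved
        stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
    )
    # Main task first, then a subtask referencing it, mirroring the demo's flow.
    print(call_tool(server, 1, "create_task", {"title": "Prepare QVAC launch notes"}))
    print(call_tool(server, 2, "create_task",
                    {"title": "Collect benchmark numbers",
                     "parent": "Prepare QVAC launch notes"}))
    server.stdin.close()
    server.wait()
```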
Ardoino confirmed the project will be released as open source soon
The decision allows developers to audit the code, adapt the assistant to specific use cases, and build custom MCP servers for additional tools. Open access also promotes adoption of the decentralized AI paradigm, where users control their data and model execution happens on hardware they own or manage directly.
The demonstration shows the viability of running capable AI assistants on consumer hardware without sacrificing functionality. Large models have traditionally required server clusters or expensive graphics cards.
QVAC demonstrates that mid-range GPUs can handle inference with response times acceptable for daily use, reducing entry barriers for users who prioritize privacy and autonomy over cloud-based convenience.
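As a rough illustration of that local-only pattern (an assumption for illustration only, since Tether has not disclosed QVAC's actual inference stack), the open-source llama-cpp-python bindings can load a quantized model from disk, offload its layers to a consumer GPU, and generate a reply with no network access:

```python
# Minimal local-inference sketch with llama-cpp-python (pip install llama-cpp-python).
# The model path is a placeholder; QVAC's real model and runtime are not public.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/assistant-7b-q4.gguf",  # any quantized GGUF model on disk
    n_gpu_layers=-1,                             # offload all layers to the local GPU
    n_ctx=2048,                                  # context window size
)

reply = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Create a task to review the Q3 roadmap."}],
    max_tokens=128,
)
print(reply["choices"][0]["message"]["content"])  # generated entirely offline
```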