TARS
TARS: A decentralized network of AI agents that capture, verify, and act on social/environmental issues using EigenLayer AVS. From Ray-Ban Meta glasses to DAO proposals, creating a trustless pipeline for social impact initiatives.
Problem Statement
TARS (Transformative Action Recognition System) revolutionizes how we address social and environmental issues by combining wearable tech, AI agents, and blockchain governance.

The problem: while people encounter numerous social issues daily, there is no automated, trustworthy system to document, verify, and act on these observations. Current solutions are centralized, manual, or lack verification.

TARS solves this through a four-layer system:

Verification Layer:
- Custom EigenLayer AVS (Actively Validated Service) ensures media authenticity
- Decentralized operator network verifies content from Ray-Ban Meta glasses
- Cryptographic signatures prevent tampering and establish chain of custody
- Multiple IPFS gateways for reliable, decentralized storage

Media Analysis Layer:
- AI agent processes verified media using Claude Vision
- Extracts comprehensive metadata (location, time, context)
- Aggregates local weather history and relevant news
- Generates detailed environmental/social impact analysis

Impact Assessment Layer:
- Specialized agent evaluates issues using multi-factor scoring
- Automatically generates DAO proposals for high-impact issues
- Smart contract integration for transparent fund management
- Community voting and proposal execution

Agent Network Layer:
- Scalable network for multiple AI agents and wearable devices
- Automated coordination between verification and analysis agents
- Cross-agent data sharing and consensus
- Expandable framework for future agent integration

Key Features:
- Trustless media verification
- Automated context gathering
- Impact-based prioritization
- Decentralized governance
- Transparent fund allocation
- Real-time social issue monitoring

Future Vision: TARS aims to create a global network of AI agents and wearable devices that continuously monitors and addresses social/environmental issues, making social impact initiatives more efficient, transparent, and actionable.
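The core idea of the Verification Layer can be sketched as follows: re-hash the media bytes and compare them against the digest attested at capture time, so any tampering in transit is detectable. The `CaptureAttestation` shape and function names below are illustrative assumptions, not the actual AVS interface, and the device-signature check that operators would also perform is omitted for brevity.

```typescript
import { createHash } from "crypto";

// Hypothetical record pinned alongside the media on IPFS at capture time.
interface CaptureAttestation {
  mediaSha256: string; // hex digest recorded when the photo was taken
  capturedAt: string;  // ISO timestamp from the glasses
  deviceId: string;    // identifier of the capturing device
}

// Re-hash the media bytes and compare with the attested digest.
// A real AVS operator would additionally verify the device's
// cryptographic signature over this attestation.
function mediaIsAuthentic(media: Buffer, att: CaptureAttestation): boolean {
  const digest = createHash("sha256").update(media).digest("hex");
  return digest === att.mediaSha256;
}

const photo = Buffer.from("example image bytes");
const attestation: CaptureAttestation = {
  mediaSha256: createHash("sha256").update(photo).digest("hex"),
  capturedAt: "2024-11-16T12:00:00Z",
  deviceId: "rayban-meta-001",
};

console.log(mediaIsAuthentic(photo, attestation));                   // true
console.log(mediaIsAuthentic(Buffer.from("tampered"), attestation)); // false
```

Because the digest (and, in the full design, the operator quorum's signatures) lives on decentralized storage, any downstream consumer can repeat this check without trusting a central server.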
Solution
I built TARS as a bridge between real-world social initiatives and Web3 governance, starting with the challenge of verifying media from Ray-Ban Meta glasses. My core innovation is a custom AVS (Actively Validated Service) on EigenLayer that verifies image authenticity, paired with Claude Vision for AI analysis. This combination lets me reliably process media from capture to verification while maintaining data integrity through a multi-gateway IPFS storage system built on Pinata.

The trickiest part was building the impact assessment system that powers the DAO. I developed a network of TypeScript agents using the ELIZA framework to gather context from multiple sources (weather data, local news, and location information) and score social initiatives. These agents feed into Arbitrum-based DAO smart contracts, which handle community voting and fund allocation.

I used the exifr library to extract raw image metadata, EigenLayer to build the AVS, and Pinata's IPFS service to store all files and media for verification. The system integrates the Claude Vision API, Dynamic Wallet for human accessibility, and the Arbitrum Sepolia testnet for the DAO smart contracts and voting. For the frontend I used React, and, last but not least, Ray-Ban Meta smart glasses to capture the media.

The end result is a system that bridges the gap between real-world social initiatives and decentralized governance while maintaining data integrity and user trust throughout the entire pipeline.
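The multi-factor scoring that decides which issues become DAO proposals can be sketched as a weighted sum over the context the agents gather. The factor names, weights, and threshold below are illustrative assumptions, not the actual ELIZA agent logic.

```typescript
// Hypothetical context gathered by the analysis agents for one issue.
interface IssueContext {
  severity: number;           // 0..1, from Claude Vision analysis
  affectedPopulation: number; // 0..1, normalized estimate from local news
  urgency: number;            // 0..1, e.g. weather-driven time pressure
  verifiability: number;      // 0..1, confidence reported by AVS operators
}

// Illustrative weights; a real deployment would tune these.
const WEIGHTS: Record<keyof IssueContext, number> = {
  severity: 0.35,
  affectedPopulation: 0.3,
  urgency: 0.2,
  verifiability: 0.15,
};

// Weighted sum of all factors, yielding a score in 0..1.
function impactScore(ctx: IssueContext): number {
  return (Object.keys(WEIGHTS) as (keyof IssueContext)[]).reduce(
    (sum, k) => sum + WEIGHTS[k] * ctx[k],
    0,
  );
}

// Only high-impact issues are turned into on-chain DAO proposals.
function shouldProposeToDao(ctx: IssueContext, threshold = 0.6): boolean {
  return impactScore(ctx) >= threshold;
}

const ctx: IssueContext = {
  severity: 0.9,
  affectedPopulation: 0.7,
  urgency: 0.8,
  verifiability: 0.95,
};
console.log(impactScore(ctx).toFixed(2)); // "0.83"
console.log(shouldProposeToDao(ctx));     // true
```

Keeping the scoring a pure function makes it easy for multiple agents to recompute and cross-check each other's scores before a proposal is submitted on-chain.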