PXL
A decentralised AI service for the detection of deepfake images and videos
Problem Statement
PXL is a deepfake detection service for images and videos, and a repository of datasets for training AI models. The major components of the system are:
- An AI pipeline for deepfake detection of images and videos.
  - For videos: each video is scanned by an open-source AI model specialized in deepfake detection. A multimodal LLM (GPT-4o) receives the results and provides a summary of the assessment; it does not analyze the video itself.
  - For images: the file is scanned by an open-source multimodal LLM (LLaVA). A second LLM (GPT-4o) combines the result of that scan with its own analysis to produce a final assessment.
- An IPFS repository of images and videos.
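
The sketch below is a minimal TypeScript illustration (not the project's actual code) of how the two pipelines above could be orchestrated. The `ModelCalls` interface and its function names are hypothetical placeholders for the calls the service dispatches to the deployed models.

```typescript
// Minimal orchestration sketch of the two detection pipelines, assuming the
// model calls are injected as functions (the real project dispatches them to
// jobs on CoopHive / Lilypad and to GPT-4o on Galadriel).

export interface Assessment {
  verdict: "likely-real" | "likely-deepfake" | "inconclusive";
  summary: string;
}

export interface ModelCalls {
  // Open-source deepfake detector applied to a video (returns raw results).
  detectVideo: (videoCid: string) => Promise<string>;
  // LLaVA multimodal scan applied to an image (returns raw results).
  scanImageWithLlava: (imageCid: string) => Promise<string>;
  // GPT-4o turns raw model output into a final, human-readable assessment.
  summarizeWithGpt4o: (rawResult: string, instructions: string) => Promise<Assessment>;
}

// Video pipeline: GPT-4o only summarizes the detector's output; it never sees the video.
export async function assessVideo(videoCid: string, models: ModelCalls): Promise<Assessment> {
  const detectorOutput = await models.detectVideo(videoCid);
  return models.summarizeWithGpt4o(detectorOutput, "Summarize this deepfake-detection result.");
}

// Image pipeline: GPT-4o combines LLaVA's scan with its own analysis.
export async function assessImage(imageCid: string, models: ModelCalls): Promise<Assessment> {
  const llavaOutput = await models.scanImageWithLlava(imageCid);
  return models.summarizeWithGpt4o(llavaOutput, "Combine this scan with your own analysis and give a final assessment.");
}
```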
Solution
The open-source AI models are deployed and used through decentralized compute marketplaces (CoopHive and Lilypad). GPT-4o is accessed through the Galadriel blockchain and its trusted execution environment. Content (images and videos) is stored and accessed through IPFS and related services: Lighthouse.storage and web3.storage, via their APIs and SDKs, are used for storing and retrieving the files. The application is built with TypeScript and Next.js; some of the code for interacting with the AI models and deploying them is written in Python.
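
As a hedged illustration of the storage side, the sketch below pins a local media file to IPFS through the Lighthouse SDK (`@lighthouse-web3/sdk`). The helper name and sample file path are illustrative, and the exact `upload` signature and response shape may differ between SDK versions.

```typescript
import lighthouse from "@lighthouse-web3/sdk";

// Illustrative helper: pins a local media file to IPFS via Lighthouse and
// returns its CID. The upload call and response shape follow recent SDK
// versions and may need adjusting for others.
async function pinToIpfs(filePath: string, apiKey: string): Promise<string> {
  const response = await lighthouse.upload(filePath, apiKey);
  return response.data.Hash; // CID of the stored content
}

async function main(): Promise<void> {
  const apiKey = process.env.LIGHTHOUSE_API_KEY ?? "";
  const cid = await pinToIpfs("./samples/suspect-video.mp4", apiKey);
  // Once pinned, the file is retrievable from any public IPFS gateway.
  console.log(`ipfs://${cid}`);
}

main().catch(console.error);
```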
Hackathon
HackFS 2024
Prizes
- 🏆 Best Use of CoopHive for AI Applications (awarded by CoopHive)
Contributors
- aifa (141 contributions)