DETECT
Deception Tracking Through Eye Cues Technology
Real-time, web-based deception detection using eye-tracking and behavioral analysis
Clone & Setup for Newbies
Get DETECT running on your machine in minutes. Just follow the steps below for your operating system.
macOS & Linux Setup
Step 1: Clone the repository
```sh
git clone https://github.com/yourusername/DETECT.js.git
cd DETECT.js
```
Step 2: Install Bun (fast JavaScript runtime)
```sh
curl -fsSL https://bun.sh/install | bash
```
Step 3: Install dependencies
```sh
bun install
```
Step 4: Start the development server
```sh
bun run dev
```
Your site will be running at http://localhost:3000 🚀
Windows Setup
Step 1: Clone the repository
```sh
git clone https://github.com/yourusername/DETECT.js.git
cd DETECT.js
```
Step 2: Install Bun for Windows
Download and run the installer from bun.sh or use PowerShell:
```powershell
powershell -c "irm bun.sh/install.ps1 | iex"
```
Step 3: Install dependencies
```sh
bun install
```
Step 4: Start the development server
```sh
bun run dev
```
Your site will be running at http://localhost:3000 🚀
Troubleshooting: if you run into issues, make sure Git is installed and on your PATH; on Windows, running the commands from Git Bash can help.
Overview
DETECT is a real-time, web-based system for analyzing potential deceptive behavior using eye-tracking and behavioral signals. It provides a non-invasive alternative to traditional polygraph-style methods by leveraging computer vision and deterministic signal processing techniques.
The system captures gaze patterns through a webcam or uploaded video, processes eye movement data in real time, and generates a deception likelihood score based on behavioral anomalies.
Key Features
⚡ Real-Time Gaze Analysis
Processes eye movement data live for immediate feedback with minimal latency.
📹 Non-Invasive Eye Tracking
Uses MediaPipe FaceMesh for facial landmark and iris detection without external hardware.
🔬 Deterministic Detection Engine
Computes deception indicators using gaze dispersion (variance) and directional acceleration.
📊 Personalized Scoring
Min-Max normalization adapts scoring across sessions for user-specific baselines.
🛡️ Robust Signal Processing
Applies Savitzky-Golay filtering and affine transformations for stability under head movement.
🌐 Web-Based Interface
Supports live webcam input and video uploads with real-time visualization and graphs.
Technology Stack
| Layer | Technology | Description |
|---|---|---|
| Frontend | Astro | High-performance web framework with minimal JavaScript overhead |
| Eye Tracking | MediaPipe FaceMesh | Real-time facial landmark and iris tracking |
| Backend | Go (Gochi) | Low-latency, concurrent processing engine |
| Database | PostgreSQL | Stores gaze metrics, timestamps, and session data |
| Runtime | Bun | Fast JavaScript runtime for frontend tooling |
| Communication | WebSockets | Real-time streaming between frontend and backend |
| Deployment | Docker + DigitalOcean | Containerized deployment environment |
System Architecture
```
Video Input (Webcam/Upload)
        ↓
Frontend (Astro)
        ↓
MediaPipe FaceMesh Eye Tracking
        ↓
Raw Gaze Coordinates
        ↓
WebSocket Stream
        ↓
Backend: Go Processing Engine
        ↓
Deterministic Analysis
(Gaze Dispersion, Directional Acceleration)
        ↓
Deception Score Calculation
        ↓
PostgreSQL Storage + Real-Time Visualization
        ↓
User Dashboard
```
Data Flow
- Video input is captured via webcam or uploaded file.
- Astro frontend processes frames using MediaPipe FaceMesh.
- Eye landmarks are converted into gaze coordinate data.
- Data is streamed to backend via WebSockets.
- Go backend applies filtering and statistical analysis.
- Deception score is computed using deterministic rules.
- Results are stored and visualized in real time.
Installation & Deployment
The system is designed for containerized deployment using Docker.
Core Tools
- Docker - Containerized deployment
- Bun - JavaScript runtime for frontend tooling
- Conda - Environment management for ML/MediaPipe components
- GitHub / JIRA - Version control and Agile workflow tracking
Deployment Environment
- Hosted on DigitalOcean VPS
- Microservice-based Docker architecture
- WebSocket-based real-time communication layer
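A microservice layout like this is often expressed as a Compose file. The sketch below is hypothetical: the service names, build paths, images, and ports are assumptions for illustration, not DETECT's actual configuration.

```yaml
# Hypothetical docker-compose.yml sketch; names, paths, and ports are illustrative.
services:
  frontend:
    build: ./frontend        # Astro app served by Bun
    ports:
      - "3000:3000"
  backend:
    build: ./backend         # Go processing engine (WebSocket endpoint)
    ports:
      - "8080:8080"
    depends_on:
      - db
  db:
    image: postgres:16
    environment:
      POSTGRES_DB: detect
      POSTGRES_PASSWORD: example
    volumes:
      - pgdata:/var/lib/postgresql/data
volumes:
  pgdata:
```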
Team
- Adam Bawatneh — Project Manager
- Brian Conn — Scrum Master & Frontend Developer
- Jeffrey Chang — Frontend Developer
- Muzamil Shamsi — Database & Frontend Developer
- Soham Ganguly — ML Engineer & Lead Full Stack Developer
- John Economou — Research & Backend Developer
Project Status
The core system is functionally complete and deployable; development follows an Agile methodology, with the remaining epics tracked below.
Epic Status
| Epic | Status |
|---|---|
| Project Management and Coordination | Done |
| Research and Data Collection | Done |
| Eye Tracking Implementation | Done |
| Anomaly Detection Algorithm | In Progress |
| Web Application Development | In Progress |
| Testing and Refinement | To Do |
| IRB Approval and Human Studies | To Do |
Future Work
- Integration of advanced eye cues (pupil dilation, blink rate)
- Addition of speech and audio behavioral analysis
- IRB-approved validation studies with human participants
- Improved hybridization of ML models with the statistical detection engine
- Enhanced personalization through long-term behavioral modeling