Intelligence Protocol  ·  6 Industries
propOgate

Other systems create messages.
propOgate creates intelligence.

Every prior system has a pipeline between the human decision and the moment structured data exists — speak, receive, interpret, classify, route, log. Six steps. Six failure points.

One gesture simultaneously classifies, routes, and logs — producing AI-ready intelligence the instant a human acts. No pipeline. No lag. No translation layer.

0 Pipeline steps
1 Gesture required
3 Simultaneous outputs
6 Industries
<0.3ms Burst window

Between deciding and knowing, every system fails.

Every prior communication system requires a human pipeline to convert a field decision into usable data. Each step adds lag. Each step can fail. By the time an AI or a commander has structured data, the moment has passed.

Every system built before this
Human decides
Communicate
Receive
Interpret
Classify
Route
Log
AI / Command acts

6 failure points. Multiple handoffs. Minutes of lag — or never.

propOgate
Human decides
Classified + Routed + Logged — simultaneously, in the gesture
AI / Command acts

0 failure points. 0 handoffs. <0.3 seconds.

One decision. One gesture. Everything else is the same action.

The classification tree is simultaneously the taxonomy, the routing ruleset, and the analytics schema. Selecting a node does all three — inseparably, in under one second.
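A minimal sketch of what "one structure, three roles" could look like: each node carries its taxonomy label, its routing target, and its analytics dimension on the same object, so a selection cannot classify without also routing and logging. The field names and node type here are illustrative assumptions, not the patented structure.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str                       # taxonomy
    route: str                       # routing ruleset
    dimension: str                   # analytics schema
    children: list = field(default_factory=list)

def select(node):
    """Selecting a node yields classification, route, and log row in
    one step: the three outputs are the same lookup on the same node."""
    return {"class": node.label,
            "route": node.route,
            "log": {"dim": node.dimension, "label": node.label}}

fault = Node("Equipment fault", route="maintenance-queue", dimension="safety")
print(select(fault))
```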

Step 01

Navigate

A directional gesture moves through a hierarchical classification tree. Touch, eye tracking, brain-computer interface, constrained voice, physical controller — any modality, same tree, same result.

Input-agnostic · Any modality · Same tree
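A sketch of how a directional gesture sequence could resolve against a 4-way tree. The labels, the direction-to-index order, and the dict layout are all illustrative assumptions; the key point is that the output is an index path, not text.

```python
DIRECTIONS = ("up", "right", "down", "left")  # index = what the payload carries

def node(label, children=None):
    return {"label": label, "children": children}

# Two-level illustrative tree; labels are placeholders, not a real taxonomy.
tree = node("root", [
    node("Safety",    [node("Near miss"), node("Equipment fault"),
                       node("Spill"),     node("Injury")]),
    node("Quality",   [node("Defect"),    node("Calibration"),
                       node("Contamination"), node("Rework")]),
    node("Logistics", [node("Shortage"),  node("Damage"),
                       node("Delay"),     node("Misroute")]),
    node("Security",  [node("Intrusion"), node("Theft"),
                       node("Tailgating"), node("Badge issue")]),
])

def navigate(root, gestures):
    """Resolve a gesture sequence to (index path, leaf label).
    Any modality that emits directions produces the same path."""
    cur, path = root, []
    for g in gestures:
        i = DIRECTIONS.index(g)
        path.append(i)
        cur = cur["children"][i]
    return path, cur["label"]

print(navigate(tree, ["up", "right"]))   # → ([0, 1], 'Equipment fault')
```

Four directions over four levels gives 4^4 = 256 distinct paths, matching the demo figures later on the page.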
Step 02

Select

Before the payload fires, the system evaluates the circular variance of the input trajectory — filtering environmental tremor, vehicle motion, and accidental contact. A kinematically verified commit simultaneously classifies, routes, and logs. Not sequentially. The same action.

Kinematically verified · Classify = Route = Log · One action
Step 03

Transmit

A 400-byte AES-256-GCM encrypted burst leaves the device in under 0.3ms. Semantically opaque — a sequence of tree node indices meaningless without the tree configuration.

400 bytes · <0.3ms · Post-quantum viable
Step 04

Intelligence exists

The receiving end has structured, classified, routed, logged data — not voice requiring transcription, not free text requiring interpretation. The AI acts immediately. No preprocessing.

Structured · Routed · Logged · AI-ready

The protocol underneath the gesture.

The patent does not describe a faster interface. It describes four infrastructure primitives — each solving a failure mode that breaks every prior system in hostile environments.

01 · Kinematic Verification

The gesture is mathematically verified before it commits.

The system evaluates the circular variance of the input trajectory — the statistical deviation of the motion path from a defined directional vector. On a vibrating industrial platform, in a moving vehicle, or under environmental tremor, the signal processing layer distinguishes an intentional directional event from noise before any classification fires. This is not swipe detection. It is a verified kinematic commitment — the same input that a standard touch system would miscategorize is filtered here before it reaches the classification layer.

Circular variance · Trajectory analysis  ·  Industrial vibration filtered  ·  Intent mathematically confirmed before payload
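Circular variance is a standard quantity from directional statistics: one minus the mean resultant length of the step directions, 0 for perfectly straight motion and 1 for directionless noise. A sketch of a commit gate built on it follows; the threshold value is an assumption, since the real gate is not public.

```python
import math

def circular_variance(points):
    """Circular variance of a 2-D trajectory's step directions:
    1 - mean resultant length (standard directional statistics)."""
    angles = [math.atan2(y2 - y1, x2 - x1)
              for (x1, y1), (x2, y2) in zip(points, points[1:])]
    c = sum(math.cos(a) for a in angles) / len(angles)
    s = sum(math.sin(a) for a in angles) / len(angles)
    return 1.0 - math.hypot(c, s)

COMMIT_THRESHOLD = 0.2  # illustrative; not taken from the patent

swipe  = [(i, 0.02 * i) for i in range(10)]   # deliberate rightward swipe
jitter = [(i % 2, 0) for i in range(10)]      # back-and-forth tremor

assert circular_variance(swipe) < COMMIT_THRESHOLD    # commits
assert circular_variance(jitter) > COMMIT_THRESHOLD   # filtered as noise
```

The swipe's step directions are nearly identical, so its variance is near 0; the tremor's steps point in opposing directions, driving the variance toward 1, which is exactly the separation the commit gate needs.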
02 · Semantic Opacity

An intercepted payload is an unreadable sequence of integers.

The structured payload contains no natural language, no message content, and no biometric data. It is a sequence of tree node indices — integers that resolve to classification paths only when the receiving end holds the exact tree configuration. Without the configuration, there is nothing to read and nothing to attribute to an individual. The tree configuration itself functions as an implicit authentication factor: possessing it proves organizational membership at the protocol level, without requiring a separate authentication exchange.

Tree-config dependent · Zero biometric  ·  Semantically opaque without configuration  ·  Tree config = implicit auth factor
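The index-only payload can be sketched with a few lines of packing code. The wire layout here (timestamp, depth, then the node indices) is an illustrative assumption, since the real 400-byte format is not public, and in production the bytes would additionally be AES-256-GCM encrypted before leaving the device.

```python
import struct

# Illustrative layout: uint32 timestamp · uint8 depth · uint8[depth]
# node indices, big-endian. No text, no identity, just integers.
def pack_payload(path, ts):
    return struct.pack(f">IB{len(path)}B", ts, len(path), *path)

def resolve(payload, tree_config):
    """Meaning exists only for a holder of the tree configuration,
    modeled here as a dict from index paths to classification labels."""
    ts, depth = struct.unpack_from(">IB", payload)
    path = tuple(struct.unpack_from(f">{depth}B", payload, 5))
    return ts, tree_config[path]

config = {(0, 1): "Safety / Equipment fault"}   # hypothetical org config
burst = pack_payload([0, 1], ts=1_700_000_000)

print(burst.hex())             # on the wire: integers, nothing to read
print(resolve(burst, config))  # → (1700000000, 'Safety / Equipment fault')
```

Without `config`, `resolve` has nothing to look up, which is the semantic-opacity property in miniature: possession of the tree configuration is what turns indices into intelligence.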
03 · Duress Architecture

The operator under duress looks identical to the operator who isn't.

A configurable duress code authenticates normally — identical UI, identical latency, identical transmitted payload structure — while embedding a silent flag at the protocol level. The receiving system acts on the flag without any observable change at the operator's end. Standard authentication flows that require behavioral deviation (a code word, a different action, a visible signal) are replaced by a protocol that makes duress and non-duress states cryptographically indistinguishable to any observer, including an adversary watching the operator's screen.

Identical UI · Silent duress flag  ·  No observable behavioral difference  ·  Auth flows intact under threat
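The indistinguishability claim can be sketched as a reserved flag bit inside the payload body. The layout is an assumption; the point is that once the body is encrypted, the duress and non-duress bursts have identical structure and length, so nothing observable changes at the operator's end.

```python
import struct

DURESS_BIT = 0b0000_0001

# Illustrative body: uint8 flags · uint8 depth · uint8[depth] indices.
# Only the receiving system ever inspects the flags byte.
def pack_body(path, duress=False):
    flags = DURESS_BIT if duress else 0
    return struct.pack(f">BB{len(path)}B", flags, len(path), *path)

normal = pack_body([0, 1])
silent = pack_body([0, 1], duress=True)

assert len(normal) == len(silent)        # identical structure and size
assert not normal[0] & DURESS_BIT
assert silent[0] & DURESS_BIT            # readable only by the receiver
```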
04 · Bidirectional Loop (H2S)

The machine can query the human. The gesture closes the loop.

The protocol is not unidirectional. An external system can push a software-originated event that surfaces as a directional classification prompt on the operator's device. The operator's gesture produces a structured, machine-readable response that routes directly back to the originating system and triggers a downstream workflow — without natural language, without transcription, without NLP. A machine queries a human and receives a verified, classified, logged response in under one second. This closes the notification loop that every prior system leaves open.

H2S integration · Software-originated prompts  ·  Gesture closes notification loop  ·  No NLP required for response routing
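An H2S round trip can be sketched as a 4-option prompt answered by a single gesture. The message shapes and field names below are assumptions, not the protocol's actual schema; what they show is that the reply is machine-readable the instant it exists, with no free text anywhere in the loop.

```python
DIRECTIONS = ("up", "right", "down", "left")

def make_prompt(prompt_id, question, options):
    """Software-originated event surfaced as a directional prompt."""
    assert len(options) == len(DIRECTIONS)   # one option per direction
    return {"id": prompt_id, "q": question, "options": options}

def gesture_reply(prompt, direction):
    """The operator's gesture becomes a structured response that
    routes straight back to the originating system."""
    return {"id": prompt["id"], "choice": DIRECTIONS.index(direction)}

p = make_prompt("wo-1138", "Confirm valve state?",
                ["Open", "Closed", "Stuck", "Unknown"])
r = gesture_reply(p, "down")
print(p["options"][r["choice"]])   # → Stuck
```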

Every role has this moment. Most organizations never capture it.

The bottleneck isn't communication — it's capture. Every team has operators making decisions in conditions where every existing system breaks down. The observation happens. The data never makes it into a record.

Warehouse · Distribution

A picker spots damaged inventory on a moving line. Both hands are on product. Walking to a terminal costs 40 seconds and drops the count. Most workers skip it. The shrink goes unrecorded.

One gesture from the floor. Record created. Line never stops.
Hospital · Clinical

A nurse between procedures observes a patient deterioration indicator. The nearest terminal is occupied. Verbal hand-off happens sometimes. The EHR stays blank.

Gesture while walking. Structured EHR entry. No terminal required.
Fleet · Transportation

A driver spots a road hazard — flooding, downed lines, fresh debris. Hands on wheel, clock running. The dispatch system requires five taps to log a custom note. The hazard reaches the next driver unannounced.

Thumb gesture on the wheel. Hazard classified and routed instantly.
Retail · Customer Service

A cashier observes a theft event mid-transaction. Speaking alerts the subject. Reaching for a radio breaks service. The 30-second window closes without a record.

Silent pocket gesture. Loss prevention notified. Customer never notices.
Construction · Field

An ironworker on an elevated platform notices a load discrepancy. Both hands on rigging. The observation gets reported — 20 minutes later, from memory, after descent.

Wrist gesture at height. Safety flag created at the moment of observation.
Education · K–12

A teacher mid-lesson observes a behavioral incident requiring documentation. Interrupting the class escalates what it's supposed to record. It gets written up after school, from recollection.

Discreet gesture mid-lesson. Incident logged contemporaneously, not reconstructed.
Precision Manufacturing · Assembly

A line technician running tool validation on a Power Focus controller needs to log a calibration pass/fail without releasing the hardware or shifting focus to a terminal. Pulling up a form on a touchscreen is a contamination risk.

Gesture while holding the tool. Pass/fail classified, logged to central system, hands never leave hardware.
Cybersecurity · Physical Red Team

A physical penetration tester or hardware auditor reaches a network node or access point during an assessment. Logging the breach marker silently — with zero acoustic signature and no visible device interaction — is operationally essential.

Pocket gesture. Node status classified, timestamped, and logged. No observable behavior.
In every case, the data isn't lost because the person didn't observe it. It's lost because capturing it required stopping. propOgate doesn't require stopping — the gesture is the record, created the instant the decision is made.

4 gestures. 256 possible routes. <0.3 seconds.

Select an industry, then make 4 directional gestures. Each level narrows the classification — 4 choices deep produces a fully classified, routed, and logged intelligence record the instant you finish.

Drag to navigate
Arrow keys also work  ·  Release to select
Classification Path
Make 4 directional selections to transmit
4 directions per level × 4 gesture levels = 256 possible classifications · <0.3ms transmission

Every high-stakes environment has this moment.

The operator cannot speak. Cannot type. Cannot wait. No system built before propOgate was designed for that moment.

01 · Defense & Military

Battlefield Intelligence

EMCON · DDIL · SOF · PACE · C2

A patrol leader is 40 meters from contact. Radio is jammed. Three casualties. The reporting tablet requires 14 fields before it routes a 9-Line MEDEVAC. Speaking is a security failure. The TOC has no picture.

One gesture: [Sector · IED · Casualties · Priority 1]. Classified, routed to TOC, logged in the ops record. No voice. No form. Command knows in 0.3 seconds.
02 · Security & Law Enforcement

Covert Situational Reporting

Undercover · Close Protection · Surveillance · Dispatch

An undercover officer makes contact with a high-value target in a crowded venue. Breaking cover to call in is not an option. Radio chatter is a tell. The clock is running.

A silent gesture through a jacket pocket: contact status classified, handler notified, timestamp logged — without a word, without a visible action.
03 · Intelligence & Covert Ops

Zero-Signature Reporting

Field Intelligence · Asset Handling · Collection Ops

A field officer confirms a high-value target. The method must leave no acoustic signature, no observable behavior, no digital trail that links action to intent.

400 bytes, AES-256-GCM encrypted, semantically opaque without the tree. A sequence of node indices — meaningless to anyone intercepting it.
04 · Medical

Hands-Occupied Clinical Reporting

Surgical · Emergency · ICU · EMS · Trauma

A surgeon's hands are occupied. Breaking sterile field to touch a tablet is not possible. Speaking exposes PHI in an ambient-mic environment. The OR team needs to know what just happened.

Eye-tracking or foot-pedal gesture classifies the event, notifies the team, writes the structured EHR record — sterile field intact, no verbal PHI, AI-ready data born at the decision point.
05 · Industrial

Eyes-Up Safety Reporting

Manufacturing · Energy · Construction · Mining

A plant operator in a 105dB environment notices a critical equipment fault. Both hands are on the machine. Looking away from the hazard to file a report is itself a safety risk.

Wrist gesture or glove controller: incident classified, safety team routed, compliance record created — eyes never left the hazard. Zero free text, zero interpretation required.
06 · Espionage & Field Intelligence

Zero-Observable Communication

NOC · Station · Asset Handling · Collection · Exfil

An asset embedded in a foreign ministry confirms a high-value document transfer. The contact window is 90 seconds. Any observable action — reaching for a phone, mouthing words, looking away — ends the operation. Possibly the life.

A gesture through a coat pocket transmits a 400-byte semantically opaque payload, AES-256-GCM encrypted. The handler receives structured intelligence. The source shows nothing. No words spoken. No action visible.

Three reasons enterprises are paying attention.

This is not a messaging application. It is a structured data capture protocol that transmits in real time. The distinction matters at scale.

01 · AI Readiness

Your AI needs labeled data. Most of yours doesn't exist yet.

The bottleneck for every enterprise AI initiative is not the model — it's the data. Specifically, ground-truth, timestamped, labeled records of what actually happened in the field. Most organizations have captured less than 10% of that data in any structured form. propOgate creates it automatically, as a byproduct of normal operations. Every gesture is a labeled example born at the moment of decision, in the exact taxonomy the organization uses. No ETL. No NLP pipeline. No normalization layer required.

02 · Compliance & Ops

Every form not filed is a liability that hasn't surfaced yet.

Near-miss reports. Incident logs. Clinical observations. Safety checklists. The compliance infrastructure most organizations maintain assumes someone will stop, locate a system, and fill out a form. In practice, that doesn't happen consistently — not in the field, not under pressure. propOgate makes reporting the natural outcome of the action, not a separate task performed afterward. When friction drops to zero, compliance rates follow. The record exists because the event happened, not because someone remembered to document it.

03 · Reach & Sovereignty

The environments where data matters most are where every system fails.

No prior communication protocol was designed for a sterile surgical field, a denied-network combat zone, a 105dB factory floor, active heavy machinery, or a covert field operation. Every existing solution either requires conditions that don't exist in those environments, or imposes overhead that breaks the task it's supposed to support. propOgate is the only system designed specifically for operators who cannot stop, cannot speak, and cannot wait. That's not a niche — it's every organization's hardest 20% of situations, generating their most consequential data. And the 400-byte burst routes entirely within closed networks — no cloud dependency, no data residency risk.

Sovereign
Data Capture

The 400-byte burst routes entirely within closed-loop, air-gapped architectures — never touching a public cloud. The payload is semantically opaque: a sequence of tree node indices that carries no natural language, no biometric data, and no meaning without the receiving system's tree configuration. Intercepted, it is unreadable. Designed for classified DoD networks, HIPAA-governed clinical systems, EU GDPR local-processing mandates, ITAR-controlled programs, and sovereign AI facilities running local-first infrastructure.

For the first time, the field creates data the AI doesn't have to translate.

Every AI that processes field data faces the same problem: humans don't communicate in machine-readable formats. propOgate eliminates the translation layer entirely.

Every prior system — what AI must do first
1. Receive raw input — audio, text, or semi-structured form data
2. Transcribe or parse the input format
3. Run NLP to extract meaning, intent, and entities
4. Classify the event type, severity, and priority
5. Determine routing and write the log entry
6. Act on the result — if the pipeline didn't fail
Lag. Pipeline failure risk. AI operating on stale or incomplete data.
propOgate — what AI receives
1. Receive structured payload: [industry · event type · location · priority · timestamp] — already classified, routed, and logged
Act immediately. No preprocessing. No pipeline. Real-time intelligence from the moment of human decision.

No NLP Required

The tree node selection IS the classification. The AI doesn't interpret — it receives structured data that was classified at the exact moment of human decision. No language model needed to understand what happened.

Real-Time Briefing

Because data is structured natively, an AI commander brief or clinical dashboard updates the instant a gesture happens — not after transcription, not after an analyst reviews it. The intelligence is live.

Native Analytics — Zero ETL

The classification tree IS the analytics schema. Every report is already categorized, dimensional, and queryable the moment it's created. No pipeline to build. No reconciliation job to run. No lag between event and insight.

Atomic Audit Trail

Every decision is logged in the same atomic write as the classification and routing. Complete fidelity — no reconstruction, no reconciliation. The AI can replay any operational moment exactly as it happened.
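The atomicity claim is the kind of thing a single database transaction demonstrates well. A sketch using SQLite follows; the schema and routing value are illustrative, but the property shown is the real one: classification, routing target, and log entry commit together or not at all.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE events (ts INTEGER, path TEXT, route TEXT)")

def commit_gesture(ts, path, route):
    # One atomic transaction: the classification path, the routing
    # target, and the log timestamp land in the same write, so no
    # partial record can ever exist.
    with db:
        db.execute("INSERT INTO events VALUES (?, ?, ?)",
                   (ts, ".".join(map(str, path)), route))

commit_gesture(1_700_000_000, [0, 1], "safety-team")
print(db.execute("SELECT * FROM events").fetchone())
# → (1700000000, '0.1', 'safety-team')
```

Replaying any operational moment is then a plain query over `events`, with no reconciliation step between separate classification, routing, and logging stores.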

Nothing else does all three simultaneously.

Every prior system treats classification, routing, and logging as separate steps. In propOgate they are structurally inseparable — the same action, the same instant.

| Capability | propOgate | Radio / Voice | Digital Forms | Encrypted Messaging |
| --- | --- | --- | --- | --- |
| When structured data exists | ✓ At the moment of decision | ✕ After transcription + classification | ✕ After submission + ETL | ✕ After manual classification |
| Classify + Route + Log in one action | ✓ Inseparable by design | ✕ Three separate steps | ✕ Three separate steps | ✕ Manual classification required |
| AI-ready without preprocessing | ✓ Structured natively | ✕ Requires ASR + NLP | ✕ Requires ETL + normalization | ✕ Requires NLP + classification |
| Zero acoustic signature | ✓ Silent, no voice required | ✕ Requires voice | ✕ Typing audible | ✕ Typing audible |
| Works in denied / degraded environments | ✓ 400-byte burst | ✕ Fails when jammed | ✕ Requires network | ✕ Requires network |
| Hands-occupied / eyes-free operation | ✓ Eye, BCI, physical controller | ✕ Requires PTT | ✕ Both hands + eyes | ✕ Both hands + eyes |
| Built-in duress / silent distress | ✓ Identical UI, silent flag | ✕ No mechanism | ✕ No mechanism | ✕ No mechanism |
| Gesture verified against environmental tremor | ✓ Circular variance filtering | ✕ Voice affected by noise | ✕ No filtering layer | ✕ No filtering layer |
| Payload unreadable without configuration | ✓ Semantically opaque · zero biometric | ✕ Voice is fully readable | ✕ Form content is readable | ✕ Metadata reveals structure |
| Machine-to-human query / response loop | ✓ H2S · gesture closes loop · no NLP | ✕ No structured response | ✕ No structured response | ✕ Free text only |

Methods and Systems for Directional-Based Classified Communication

Provisional patent application filed April 22, 2026. The triple-action selection mechanism, semantic opacity model, modality-agnostic input processing, duress detection architecture, and unified classification-routing-analytics tree structure are covered under patent-pending IP.

Patent Pending
Application #589695761 · Filed 2026-04-22
propOgate

Ready to see it working?

Live demo across all six industries. On device. 30 minutes. No slides.

Or reach out directly  ·  info@propogate.com