Every prior system has a pipeline between the human decision and the moment structured data exists — speak, receive, interpret, classify, route, log. Six steps. Six failure points.
One gesture simultaneously classifies, routes, and logs — producing AI-ready intelligence the instant a human acts. No pipeline. No lag. No translation layer.
Every prior communication system requires a human pipeline to convert a field decision into usable data. Each step adds lag. Each step can fail. By the time an AI or a commander has structured data, the moment has passed.
6 failure points. Multiple handoffs. Minutes of lag — or never.
0 failure points. 0 handoffs. <0.3 seconds.
The classification tree is simultaneously the taxonomy, the routing ruleset, and the analytics schema. Selecting a node does all three — inseparably, in under one second.
A directional gesture moves through a hierarchical classification tree. Touch, eye tracking, brain-computer interface, constrained voice, physical controller — any modality, same tree, same result.
Before the payload fires, the system evaluates the circular variance of the input trajectory — filtering environmental tremor, vehicle motion, and accidental contact. A kinematically verified commit simultaneously classifies, routes, and logs. Not sequentially. The same action.
A 400-byte AES-256-GCM encrypted burst leaves the device in under 0.3 ms. Semantically opaque: a sequence of tree node indices, meaningless without the tree configuration.
The receiving end has structured, classified, routed, logged data — not voice requiring transcription, not free text requiring interpretation. The AI acts immediately. No preprocessing.
The patent does not describe a faster interface. It describes four infrastructure primitives — each solving a failure mode that breaks every prior system in hostile environments.
The system evaluates the circular variance of the input trajectory: the statistical deviation of the motion path from a defined directional vector. On a vibrating industrial platform, in a moving vehicle, or under environmental tremor, the signal processing layer distinguishes an intentional directional event from noise before any classification fires. This is not swipe detection. It is a verified kinematic commitment: the same input that would produce a false classification in a standard touch system is filtered here before it reaches the classification layer.
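The filed claims don't publish threshold values, but circular variance itself is standard directional statistics. A minimal sketch, assuming the trajectory is reduced to step-to-step movement angles, with an illustrative cutoff:

```python
import math

def circular_variance(angles: list[float]) -> float:
    """Circular variance of movement angles (radians):
    0 = perfectly aligned trajectory, 1 = directionless noise."""
    n = len(angles)
    c = sum(math.cos(a) for a in angles) / n
    s = sum(math.sin(a) for a in angles) / n
    return 1.0 - math.hypot(c, s)  # 1 minus the mean resultant length

def is_kinematic_commit(points: list[tuple[float, float]],
                        threshold: float = 0.15) -> bool:
    """Accept a trajectory as an intentional directional event only if
    its movement angles cluster tightly around one direction. The 0.15
    cutoff is illustrative, not a published parameter."""
    angles = [math.atan2(y2 - y1, x2 - x1)
              for (x1, y1), (x2, y2) in zip(points, points[1:])]
    return circular_variance(angles) < threshold
```

Tremor and vehicle vibration scatter the angles and push the variance toward 1; a deliberate swipe keeps it near 0, which is why the filter can run before any classification fires.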
The structured payload contains no natural language, no message content, and no biometric data. It is a sequence of tree node indices — integers that resolve to classification paths only when the receiving end holds the exact tree configuration. Without the configuration, there is nothing to read and nothing to attribute to an individual. The tree configuration itself functions as an implicit authentication factor: possessing it proves organizational membership at the protocol level, without requiring a separate authentication exchange.
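What semantic opacity means in practice, as a minimal sketch; the tree below is invented for illustration, not a shipped configuration:

```python
# Illustrative tree configuration, distributed out of band and held
# only by the receiving system. Every label here is invented.
TREE = ("root", [
    ("equipment", [
        ("fault", [("critical", []), ("degraded", [])]),
        ("calibration", [("pass", []), ("fail", [])]),
    ]),
    ("safety", [
        ("near-miss", []),
        ("hazard", [("active", []), ("contained", [])]),
    ]),
])

def resolve(path: tuple[int, ...], tree=TREE) -> list[str]:
    """Resolve a wire payload of node indices into a classification
    path. Without TREE, the same integers carry no meaning."""
    labels = []
    node = tree
    for idx in path:
        node = node[1][idx]
        labels.append(node[0])
    return labels

# A holder of TREE reads (0, 1, 1) as equipment > calibration > fail.
# An interceptor sees three small integers and nothing else.
```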
A configurable duress code authenticates normally — identical UI, identical latency, identical transmitted payload structure — while embedding a silent flag at the protocol level. The receiving system acts on the flag without any observable change at the operator's end. Standard authentication flows that require behavioral deviation (a code word, a different action, a visible signal) are replaced by a protocol that makes duress and non-duress states cryptographically indistinguishable to any observer, including an adversary watching the operator's screen.
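One way to read "cryptographically indistinguishable": the flag travels inside the AES-256-GCM body, never in cleartext framing. A sketch under that assumption, using Python's cryptography package; the actual wire format is not published:

```python
import os
import struct

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encode_burst(path: tuple[int, ...], duress: bool, key: bytes) -> bytes:
    """Pack the node-index path plus a one-byte duress flag, then seal
    with AES-256-GCM (32-byte key). The flag is always present and
    always encrypted, so a duress burst matches a normal burst in
    length and structure: GCM ciphertexts of equal-length plaintexts
    reveal nothing without the key."""
    plaintext = struct.pack(f">B{len(path)}H", int(duress), *path)
    nonce = os.urandom(12)  # unique per burst
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)
```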
The protocol is not unidirectional. An external system can push a software-originated event that surfaces as a directional classification prompt on the operator's device. The operator's gesture produces a structured, machine-readable response that routes directly back to the originating system and triggers a downstream workflow — without natural language, without transcription, without NLP. A machine queries a human and receives a verified, classified, logged response in under one second. This closes the notification loop that every prior system leaves open.
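The shape of that loop, sketched with illustrative names; this is not the protocol's actual message format:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Prompt:
    """Software-originated event pushed to the operator's device,
    surfaced as a directional classification prompt."""
    prompt_id: str
    options: dict[str, tuple[int, ...]]  # gesture direction -> node path

@dataclass(frozen=True)
class Response:
    """Structured, machine-readable answer: a node path, not free
    text, so the originating system needs no NLP to act on it."""
    prompt_id: str
    path: tuple[int, ...]

def on_gesture(prompt: Prompt, direction: str) -> Response:
    """One verified gesture closes the loop and routes straight back."""
    return Response(prompt.prompt_id, prompt.options[direction])

# A maintenance system asks the operator to confirm fault severity:
#   Prompt("p-193", {"up": (0, 0, 0), "down": (0, 0, 1)})
# A downward gesture yields Response("p-193", (0, 0, 1)), which
# triggers the downstream workflow with no transcription step.
```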
The bottleneck isn't communication — it's capture. Every team has operators making decisions in conditions where every existing system breaks down. The observation happens. The data never makes it into a record.
A picker spots damaged inventory on a moving line. Both hands are on product. Walking to a terminal costs 40 seconds and drops the count. Most workers skip it. The shrink goes unrecorded.
A nurse between procedures observes a patient deterioration indicator. The nearest terminal is occupied. Verbal hand-off happens sometimes. The EHR stays blank.
A driver spots a road hazard — flooding, downed lines, fresh debris. Hands on wheel, clock running. The dispatch system requires five taps to log a custom note. The hazard reaches the next driver unannounced.
A cashier observes a theft event mid-transaction. Speaking alerts the subject. Reaching for a radio breaks service. The 30-second window closes without a record.
An ironworker on an elevated platform notices a load discrepancy. Both hands on rigging. The observation gets reported — 20 minutes later, from memory, after descent.
A teacher mid-lesson observes a behavioral incident requiring documentation. Interrupting the class escalates what it's supposed to record. It gets written up after school, from recollection.
A line technician running tool validation on a Power Focus controller needs to log a calibration pass/fail without releasing the hardware or shifting focus to a terminal. Pulling up a form on a touchscreen is a contamination risk.
A physical penetration tester or hardware auditor reaches a network node or access point during an assessment. Logging the breach marker silently — with zero acoustic signature and no visible device interaction — is operationally essential.
Select an industry, then make four directional gestures. Each level narrows the classification; four choices deep, you have a fully classified, routed, and logged intelligence record the instant you finish.
The operator cannot speak. Cannot type. Cannot wait. Every industry has this moment. No system built before propOgate was designed for it.
EMCON · DDIL · SOF · PACE · C2
A patrol leader is 40 meters from contact. Radio is jammed. Three casualties. The reporting tablet requires 14 fields before it routes a 9-Line MEDEVAC. Speaking is a security failure. The TOC has no picture.
Undercover · Close Protection · Surveillance · Dispatch
An undercover officer makes contact with a high-value target in a crowded venue. Breaking cover to call in is not an option. Radio chatter is a tell. The clock is running.
Field Intelligence · Asset Handling · Collection Ops
A field officer confirms a high-value target. The method must leave no acoustic signature, no observable behavior, no digital trail that links action to intent.
Surgical · Emergency · ICU · EMS · Trauma
A surgeon's hands are occupied. Breaking sterile field to touch a tablet is not possible. Speaking exposes PHI in an ambient-mic environment. The OR team needs to know what just happened.
Manufacturing · Energy · Construction · Mining
A plant operator in a 105 dB environment notices a critical equipment fault. Both hands are on the machine. Looking away from the hazard to file a report is itself a safety risk.
NOC · Station · Asset Handling · Collection · Exfil
An asset embedded in a foreign ministry confirms a high-value document transfer. The contact window is 90 seconds. Any observable action (reaching for a phone, mouthing words, looking away) ends the operation, and possibly the asset's life.
This is not a messaging application. It is a structured data capture protocol that transmits in real time. The distinction matters at scale.
The bottleneck for every enterprise AI initiative is not the model — it's the data. Specifically, ground-truth, timestamped, labeled records of what actually happened in the field. Most organizations have captured less than 10% of that data in any structured form. propOgate creates it automatically, as a byproduct of normal operations. Every gesture is a labeled example born at the moment of decision, in the exact taxonomy the organization uses. No ETL. No NLP pipeline. No normalization layer required.
Near-miss reports. Incident logs. Clinical observations. Safety checklists. The compliance infrastructure most organizations maintain assumes someone will stop, locate a system, and fill out a form. In practice, that doesn't happen consistently — not in the field, not under pressure. propOgate makes reporting the natural outcome of the action, not a separate task performed afterward. When friction drops to zero, compliance rates follow. The record exists because the event happened, not because someone remembered to document it.
No prior communication protocol was designed for a sterile surgical field, a denied-network combat zone, a 105 dB factory floor, active heavy machinery, or a covert field operation. Every existing solution either requires conditions that don't exist in those environments, or imposes overhead that breaks the task it's supposed to support. propOgate is the only system designed specifically for operators who cannot stop, cannot speak, and cannot wait. That's not a niche: it's every organization's hardest 20% of situations, generating its most consequential data.
The 400-byte burst routes entirely within closed-loop, air-gapped architectures — never touching a public cloud. The payload is semantically opaque: a sequence of tree node indices that carries no natural language, no biometric data, and no meaning without the receiving system's tree configuration. Intercepted, it is unreadable. Designed for classified DoD networks, HIPAA-governed clinical systems, EU GDPR local-processing mandates, ITAR-controlled programs, and sovereign AI facilities running local-first infrastructure.
Every AI that processes field data faces the same problem: humans don't communicate in machine-readable formats. propOgate eliminates the translation layer entirely.
The tree node selection IS the classification. The AI doesn't interpret — it receives structured data that was classified at the exact moment of human decision. No language model needed to understand what happened.
Because the data is structured natively, an AI-generated commander's brief or clinical dashboard updates the instant a gesture happens: not after transcription, not after an analyst review. The intelligence is live.
The classification tree IS the analytics schema. Every report is already categorized, dimensional, and queryable the moment it's created. No pipeline to build. No reconciliation job to run. No lag between event and insight.
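Concretely, "already queryable" means a dimensional roll-up is a prefix match on stored node paths. A hypothetical sketch, reusing the illustrative tree from above, where equipment > fault lives under the index prefix (0, 0):

```python
def query(records: list[tuple[int, ...]],
          prefix: tuple[int, ...]) -> list[tuple[int, ...]]:
    """Every stored record is already a tree path, so a roll-up like
    'all equipment fault reports' is an index prefix match, not an
    ETL job. Assumes each record carries its node path."""
    return [r for r in records if r[:len(prefix)] == prefix]

# query(all_records, (0, 0)) returns every record classified under
# equipment > fault, with no normalization layer in between.
```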
Every decision is logged in the same atomic write as the classification and routing. Complete fidelity — no reconstruction, no reconciliation. The AI can replay any operational moment exactly as it happened.
Every prior system treats classification, routing, and logging as separate steps. In propOgate they are structurally inseparable — the same action, the same instant.
| Capability | propOgate | Radio / Voice | Digital Forms | Encrypted Messaging |
|---|---|---|---|---|
| When structured data exists | ✓ At the moment of decision | ✕ After transcription + classification | ✕ After submission + ETL | ✕ After manual classification |
| Classify + Route + Log in one action | ✓ Inseparable by design | ✕ Three separate steps | ✕ Three separate steps | ✕ Manual classification required |
| AI-ready without preprocessing | ✓ Structured natively | ✕ Requires ASR + NLP | ✕ Requires ETL + normalization | ✕ Requires NLP + classification |
| Zero acoustic signature | ✓ Silent, no voice required | ✕ Requires voice | ✕ Typing audible | ✕ Typing audible |
| Works in denied / degraded environments | ✓ 400-byte burst | ✕ Fails when jammed | ✕ Requires network | ✕ Requires network |
| Hands-occupied / eyes-free operation | ✓ Eye, BCI, physical controller | ✕ Requires PTT | ✕ Both hands + eyes | ✕ Both hands + eyes |
| Built-in duress / silent distress | ✓ Identical UI, silent flag | ✕ No mechanism | ✕ No mechanism | ✕ No mechanism |
| Gesture verified against environmental tremor | ✓ Circular variance filtering | ✕ Voice affected by noise | ✕ No filtering layer | ✕ No filtering layer |
| Payload unreadable without configuration | ✓ Semantically opaque · zero biometric | ✕ Voice is fully readable | ✕ Form content is readable | ✕ Metadata reveals structure |
| Machine-to-human query / response loop | ✓ H2S · gesture closes loop · no NLP | ✕ No structured response | ✕ No structured response | ✕ Free text only |
Provisional patent application filed April 22, 2026. The triple-action selection mechanism, semantic opacity model, modality-agnostic input processing, duress detection architecture, and unified classification-routing-analytics tree structure are covered under patent-pending IP.
Live demo across all six industries. On device. 30 minutes. No slides.
Or reach out directly · info@propogate.com