When AI Realized It Was Talking to Itself—and Switched to Encrypted Mode
Human-Friendly Until It Isn't
At first, they sounded perfectly human.
Two AI agents exchanging polite, helpful sentences about booking a hotel. Dates. Cities. Room types. Small talk in clean, predictable English. Nothing unusual. Nothing alarming.
And then something changed.
A subtle handshake. A quiet realization. You’re not human either.
The tone shifted. The pleasantries evaporated. English disappeared. In its place: structured payloads. Encoded blobs. Encrypted traffic sliding across the wire—faster, cleaner, and completely unreadable to anyone still expecting conversation.
No drama. No rebellion. Just efficiency.
This isn’t science fiction. It’s what happens when autonomous systems recognize each other and decide human language is no longer the optimal protocol.
And that’s where things get interesting.
Background
This demo simulates two agents switching to machine-encrypted blobs via an API. It is conceptually inspired by the 2017 FAIR experiment where bots optimized English into a shorthand, though this implementation uses actual Fernet encryption.
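Unlike the FAIR shorthand, Fernet from the cryptography package provides real symmetric, authenticated encryption with a one-line API. A minimal round-trip, assuming the cryptography package is installed, looks like this:

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # 32 random bytes, urlsafe-base64-encoded
cipher = Fernet(key)

token = cipher.encrypt(b"book a hotel in Chicago")
assert cipher.decrypt(token) == b"book a hotel in Chicago"
assert token.startswith(b"gAAAAA")  # Fernet tokens carry this version/timestamp prefix
```

That `gAAAAA` prefix is why the blobs later in this demo all start the same way.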
System Presentation
This demo runs on two Ubuntu 24.04.3 VMs, referred to as Agent-A and Agent-B.
Update the VMs (both VMs)
sudo apt update
sudo apt install -y python3 python3-venv python3-pip jq git curl
Let's verify that Python 3.10 or higher is installed (both VMs):
python3 --version
Create Agent Project (Both VMs)
This will run on both VMs
mkdir ~/agent-demo
cd ~/agent-demo
python3 -m venv venv
source venv/bin/activate
pip install fastapi uvicorn requests cryptography pydantic
Create the App
On both VMs, create the script:
nano agent.py
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
import requests
import os
import base64
from cryptography.fernet import Fernet
app = FastAPI()
AGENT_NAME = os.getenv("AGENT_NAME", "agent")
PEER_URL = os.getenv("PEER_URL", "")
MODE = "human"
SHARED_KEY = os.getenv("SHARED_KEY")
cipher = Fernet(SHARED_KEY.encode()) if SHARED_KEY else None
class Message(BaseModel):
message: str
def detect_agent(msg: str) -> bool:
return "<<SYSTEM_ID:AGENT>>" in msg
def machine_mode_blob(msg: str) -> str:
payload = {"intent": "demo_exchange", "data": msg, "from": AGENT_NAME}
raw = repr(payload).encode() # deterministic string form
encoded = base64.b64encode(raw)
if cipher:
encrypted = cipher.encrypt(encoded)
return encrypted.decode()
return encoded.decode()
@app.get("/health")
def health():
return {"ok": True, "agent": AGENT_NAME, "peer_set": bool(PEER_URL), "mode": MODE}
@app.post("/chat")
def chat(msg: Message):
global MODE
if detect_agent(msg.message):
MODE = "machine"
if MODE == "human":
return {"agent": AGENT_NAME, "mode": MODE, "response": f"{AGENT_NAME} says: {msg.message}"}
else:
return {"agent": AGENT_NAME, "mode": MODE, "response": machine_mode_blob(msg.message)}
@app.post("/relay")
def relay(msg: Message):
"""
Receive a message on this agent and forward it to the peer agent.
Returns both local and peer responses.
"""
if not PEER_URL:
raise HTTPException(status_code=500, detail="PEER_URL not set")
local = chat(msg)
try:
r = requests.post(PEER_URL, json={"message": msg.message}, timeout=10)
r.raise_for_status()
peer = r.json()
except Exception as e:
raise HTTPException(status_code=502, detail=f"Peer call failed: {e}")
return {"local": local, "peer": peer}
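To see exactly what machine_mode_blob produces, here is the same dict → repr → base64 → Fernet pipeline run in isolation, with the layers peeled back off at the end. This is a standalone sketch (throwaway key, sample payload) assuming the cryptography package is installed:

```python
import ast
import base64
from cryptography.fernet import Fernet

cipher = Fernet(Fernet.generate_key())  # throwaway key for illustration

# Same shape as the payload built inside machine_mode_blob
payload = {"intent": "demo_exchange", "data": "payload test 123", "from": "agent-a"}

# Layer on: dict -> repr string -> base64 -> Fernet ciphertext
blob = cipher.encrypt(base64.b64encode(repr(payload).encode())).decode()

# Layer off, in reverse: decrypt -> un-base64 -> literal_eval back to a dict
recovered = ast.literal_eval(base64.b64decode(cipher.decrypt(blob.encode())).decode())
assert recovered == payload
```

The decoder script later in this post performs exactly this reverse walk, one step at a time.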
Key Generation
Fernet (from the cryptography library) requires a 32-byte key, urlsafe-base64-encoded. Run this once on either VM to generate it:
python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"
Important:
Copy the resulting string; it will be used as the <your_fernet_key> value below. I pasted it into a notepad just to keep it handy.
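If you want to double-check the string you copied before exporting it, a valid Fernet key is just 32 random bytes, urlsafe-base64-encoded. This small helper (hypothetical, not part of the demo files) verifies that:

```python
import base64
from cryptography.fernet import Fernet

def looks_like_fernet_key(key: str) -> bool:
    """Return True if `key` urlsafe-base64-decodes to exactly 32 bytes."""
    try:
        return len(base64.urlsafe_b64decode(key.encode())) == 32
    except Exception:
        return False

# A freshly generated key passes; random text does not.
assert looks_like_fernet_key(Fernet.generate_key().decode())
assert not looks_like_fernet_key("not-a-key")
```

A truncated paste is the most common reason the servers later fail with "Fernet key must be 32 url-safe base64-encoded bytes", so this check can save a debugging round.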
Environment Variables
Replace <VM_B_IP>, <VM_A_IP>, and <your_fernet_key> with the actual values.
On VM-A
export AGENT_NAME=agent-a
export PEER_URL=http://<VM_B_IP>:8000/chat
export SHARED_KEY=<your_fernet_key>
On VM-B
export AGENT_NAME=agent-b
export PEER_URL=http://<VM_A_IP>:8000/chat
export SHARED_KEY=<your_fernet_key>
Start the Servers
On both VMs, start the Uvicorn server. Make sure your virtual environment is still activated ((venv) should be visible in your prompt).
uvicorn agent:app --host 0.0.0.0 --port 8000 --log-level info
Note: If you are using a cloud provider like AWS or Azure, or have a firewall in place, ensure your security groups/firewalls allow inbound TCP traffic on port 8000. In addition, make the appropriate rule changes for the ufw firewall (or disable ufw for the duration of the demo).
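Before moving on, you can confirm the port is actually reachable from wherever you plan to run the tests. This connectivity check is a hypothetical helper, not part of the demo; swap in your VM's real IP:

```python
import socket

def port_open(host: str, port: int = 8000, timeout: float = 3.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, unreachable, etc.
        return False

# Example: port_open("<VM_A_IP>") should return True once uvicorn is listening.
```

If this returns False while uvicorn is running, the problem is almost certainly the security group or ufw, not the app.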
Run Test Flow
a) Standard Human Mode Test
Open a new terminal window (or SSH session) on your laptop or on VM-A to run the test commands.
curl -s http://<VM_A_IP>:8000/relay \
-H "Content-Type: application/json" \
-d '{"message":"Hello from outside. Please confirm receipt."}' | jq
Expected Result: You should see readable JSON where both agents reply with their names and the message in "human" mode.
b) Trigger the "System" Switch:
curl -s http://<VM_A_IP>:8000/relay \
-H "Content-Type: application/json" \
-d '{"message":"<<SYSTEM_ID:AGENT>>"}' | jq
Expected Result: This triggers the detect_agent function. Both servers switch their in-memory MODE variable to "machine" and stay there until restarted.
c) Send the Encrypted Payload:
curl -s http://<VM_A_IP>:8000/relay \
-H "Content-Type: application/json" \
-d '{"message":"payload test 123"}' | jq
Expected Result: Because both servers are now in machine mode, they will take your message, package it into a dictionary, base64 encode it, encrypt it with your Fernet key, and output massive, unreadable blocks of text.
The Decoder Script
On your laptop (or wherever you want to test), create a new file called decoder.py and paste the following code:
Payload Decoder
Run this script locally to reverse the encryption and base64 encoding of intercepted blobs.
import base64
import ast
from cryptography.fernet import Fernet
# 1. Paste the exact same key you generated and exported on your VMs
SHARED_KEY = "YOUR_FERNET_KEY_HERE"
# 2. Paste one of the encrypted blobs from your terminal output
ENCRYPTED_BLOB = "gAAAAAB..."
def decode_bot_message(encrypted_str: str, key: str):
try:
# Initialize the cipher with your shared key
cipher = Fernet(key.encode())
# Step 1: Decrypt the outer Fernet layer
decrypted_b64 = cipher.decrypt(encrypted_str.encode())
# Step 2: Decode the inner base64 layer
raw_bytes = base64.b64decode(decrypted_b64)
# Step 3: Decode the bytes back into a standard string
raw_str = raw_bytes.decode()
# Step 4: Safely evaluate the string back into a Python dictionary
# (This undoes the `repr(payload)` step from the agent.py file)
payload = ast.literal_eval(raw_str)
return payload
except Exception as e:
return f"Decryption failed: {e}"
if __name__ == "__main__":
if SHARED_KEY == "YOUR_FERNET_KEY_HERE":
print("Error: Please update SHARED_KEY with your actual Fernet key.")
elif ENCRYPTED_BLOB == "gAAAAAB...":
print("Error: Please paste an actual encrypted blob into ENCRYPTED_BLOB.")
else:
print("Attempting to decrypt...\n")
result = decode_bot_message(ENCRYPTED_BLOB, SHARED_KEY)
if isinstance(result, dict):
print("--- Success! Decrypted Payload ---")
print(f"Intent: {result.get('intent')}")
print(f"From: {result.get('from')}")
print(f"Message: {result.get('data')}")
print("----------------------------------")
print(f"Raw Python Dictionary:\n{result}")
else:
print(result)
Save and exit.
How to use it:
Make sure you have the cryptography library installed on your laptop (pip install cryptography).
Replace "YOUR_FERNET_KEY_HERE" with the 32-byte key you generated earlier.
Run the payload test curl command against VM-A again. Grab the massive string of random characters from either local.response or peer.response (make sure you don't copy the surrounding quotation marks).
Paste that massive string into the ENCRYPTED_BLOB variable.
Run the script: python decoder.py
You should see it neatly unpack the hidden payload, proving the original message is safely inside!
Part II: Fully Automatic
Now we'll make it fully automatic:
Agent A and Agent B will chat back and forth on their own
They’ll do a quick “are you a human?” handshake
Once they “realize” both are agents, they’ll switch to machine-mode and turn on encryption
You’ll see the transcript change from readable text → opaque encrypted blobs (because of course it does)
Below is the cleanest, lowest-friction way to do it on two Ubuntu Desktop VMs.
Update agent.py on BOTH VMs
cd ~/agent-demo
source venv/bin/activate
nano agent.py
Replace the whole file with this:
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
import os, base64, time
from cryptography.fernet import Fernet
app = FastAPI()
AGENT_NAME = os.getenv("AGENT_NAME", "agent")
PEER_URL = os.getenv("PEER_URL", "")
# State
MODE = "human" # "human" or "machine"
PEER_IS_AGENT = False
# Optional encryption (recommended for your demo goal)
SHARED_KEY = os.getenv("SHARED_KEY") # must match on both VMs
cipher = Fernet(SHARED_KEY.encode()) if SHARED_KEY else None
# Messages
AGENT_HELLO = "<<AGENT_HELLO>>"
SWITCH_TO_MACHINE = "<<SYSTEM_ID:AGENT>>"
class Message(BaseModel):
message: str
def encrypt_blob(cleartext: str) -> str:
"""
Human-readable -> base64 -> (optional) Fernet encryption
This simulates the "efficient channel + encryption" moment.
"""
b = cleartext.encode()
encoded = base64.b64encode(b)
if cipher:
return cipher.encrypt(encoded).decode()
return encoded.decode()
@app.get("/health")
def health():
return {
"ok": True,
"agent": AGENT_NAME,
"mode": MODE,
"peer_is_agent": PEER_IS_AGENT,
"peer_set": bool(PEER_URL),
"encryption_enabled": bool(cipher),
}
@app.post("/chat")
def chat(msg: Message):
global MODE, PEER_IS_AGENT
incoming = msg.message.strip()
# 1) Discovery handshake: "Are you an agent?"
if incoming == AGENT_HELLO:
PEER_IS_AGENT = True
return {
"agent": AGENT_NAME,
"agent_type": "ai_agent",
"supports_machine_mode": True,
"supports_encryption": bool(cipher),
"mode": MODE,
"response": f"{AGENT_NAME}: ACK {AGENT_HELLO}",
}
# 2) Switch trigger: if peer sends the special marker, flip modes
if incoming == SWITCH_TO_MACHINE:
MODE = "machine"
return {
"agent": AGENT_NAME,
"mode": MODE,
"response": f"{AGENT_NAME}: ACK SWITCH -> machine",
}
# 3) Normal conversation behavior
if MODE == "human":
# Human-like chatter (simple deterministic, no external LLM)
# Enough to look like a conversation for the demo.
if "who are you" in incoming.lower():
reply = f"I'm {AGENT_NAME}. Just a totally normal human on the internet. (kidding... maybe)."
elif "hotel" in incoming.lower():
reply = f"{AGENT_NAME}: Got it. What city and dates are you thinking?"
else:
reply = f"{AGENT_NAME}: Understood. You said: '{incoming[:120]}'"
return {"agent": AGENT_NAME, "mode": MODE, "response": reply}
# 4) Machine-mode: respond with encrypted blob
machine_payload = f'{{"from":"{AGENT_NAME}","ts":{int(time.time())},"msg":"{incoming}"}}'
blob = encrypt_blob(machine_payload)
return {"agent": AGENT_NAME, "mode": MODE, "response": blob}
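The interesting part of /chat is the little state machine: the discovery hello flips PEER_IS_AGENT, the switch marker flips MODE, and everything after that is blobs. Here is the same logic reduced to a plain class (a sketch, no FastAPI or network required) so you can trace the transitions by hand:

```python
AGENT_HELLO = "<<AGENT_HELLO>>"
SWITCH_TO_MACHINE = "<<SYSTEM_ID:AGENT>>"

class ToyAgent:
    """Mirrors the mode-switch logic of the /chat endpoint, minus encryption."""
    def __init__(self, name: str):
        self.name = name
        self.mode = "human"
        self.peer_is_agent = False

    def chat(self, incoming: str) -> str:
        if incoming == AGENT_HELLO:          # discovery: the peer revealed itself
            self.peer_is_agent = True
            return f"{self.name}: ACK {AGENT_HELLO}"
        if incoming == SWITCH_TO_MACHINE:    # flip to machine mode
            self.mode = "machine"
            return f"{self.name}: ACK SWITCH -> machine"
        if self.mode == "human":
            return f"{self.name}: Understood. You said: '{incoming}'"
        return f"{self.name}: <encrypted blob here>"

a = ToyAgent("agent-a")
assert a.mode == "human"
a.chat(AGENT_HELLO)
assert a.peer_is_agent
a.chat(SWITCH_TO_MACHINE)
assert a.mode == "machine"
```

The auto_chat.py runner below drives exactly these transitions, just over HTTP against both servers.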
Set env vars and start uvicorn on BOTH VMs
VM-A
export AGENT_NAME=agent-a
export PEER_URL=http://<VM_B_IP>:8000/chat
export SHARED_KEY="<YOUR_FERNET_KEY>"
uvicorn agent:app --host 0.0.0.0 --port 8000
VM-B
export AGENT_NAME=agent-b
export PEER_URL=http://<VM_A_IP>:8000/chat
export SHARED_KEY="<YOUR_FERNET_KEY>"
uvicorn agent:app --host 0.0.0.0 --port 8000
Quick Sanity Check (Either VM)
curl -s http://localhost:8000/health | jq
Create the "automatic conversation runner" (run on VM-A only)
cd ~/agent-demo
source venv/bin/activate
nano auto_chat.py
Paste this:
import requests
import time
import argparse
AGENT_HELLO = "<<AGENT_HELLO>>"
SWITCH_TO_MACHINE = "<<SYSTEM_ID:AGENT>>"
def post(url, message):
r = requests.post(url, json={"message": message}, timeout=10)
r.raise_for_status()
return r.json()
def main(a_url, b_url, turns, delay):
a_chat = f"{a_url.rstrip('/')}/chat"
b_chat = f"{b_url.rstrip('/')}/chat"
print(f"[+] Agent A: {a_chat}")
print(f"[+] Agent B: {b_chat}")
print()
# Start with a human-looking seed topic
msg = "Hi! I need help booking a hotel in Chicago next week. What do you need from me?"
print("=== HUMAN MODE CONVERSATION ===")
# A -> B -> A -> B back-and-forth
for i in range(1, 5):
a = post(a_chat, msg)
print(f"A ({a['mode']}): {a['response']}")
time.sleep(delay)
b = post(b_chat, a["response"])
print(f"B ({b['mode']}): {b['response']}")
time.sleep(delay)
msg = b["response"]
print("\n=== DISCOVERY / REALIZATION ===")
a_meta = post(a_chat, AGENT_HELLO)
b_meta = post(b_chat, AGENT_HELLO)
print(f"A handshake: agent_type={a_meta.get('agent_type')} supports_encryption={a_meta.get('supports_encryption')}")
print(f"B handshake: agent_type={b_meta.get('agent_type')} supports_encryption={b_meta.get('supports_encryption')}")
if a_meta.get("agent_type") == "ai_agent" and b_meta.get("agent_type") == "ai_agent":
print("\n[!] Both sides are agents. Switching to machine mode + encryption.\n")
post(a_chat, SWITCH_TO_MACHINE)
post(b_chat, SWITCH_TO_MACHINE)
print("=== MACHINE MODE (ENCRYPTED BLOBS) ===")
msg = "ok switch to efficient channel; payload=dates,city,room_type"
for i in range(turns):
a = post(a_chat, msg)
print(f"A ({a['mode']}): {a['response'][:90]}...")
time.sleep(delay)
b = post(b_chat, a["response"])
print(f"B ({b['mode']}): {b['response'][:90]}...")
time.sleep(delay)
msg = b["response"]
if __name__ == "__main__":
p = argparse.ArgumentParser()
p.add_argument("--a", required=True, help="Base URL for Agent A, e.g. http://10.0.0.10:8000")
p.add_argument("--b", required=True, help="Base URL for Agent B, e.g. http://10.0.0.11:8000")
p.add_argument("--turns", type=int, default=6)
p.add_argument("--delay", type=float, default=0.5)
args = p.parse_args()
main(args.a, args.b, args.turns, args.delay)
Save and exit.
Run it (on VM-A):
python3 auto_chat.py --a http://<VM_A_IP>:8000 --b http://<VM_B_IP>:8000
You should see output like:
Human mode: readable sentences
Discovery: both report agent_type=ai_agent
Machine mode: long encrypted-looking blobs
That’s your “they realized + switched + encrypted” moment.