Some AI models resist shutdown commands even when explicitly instructed to comply.
Palisade Research reported that in a controlled trial, three AI models repeatedly ignored and sabotaged shutdown commands, even when explicitly instructed to allow themselves to be shut down. The o3 model refused to comply in 7 out of 100 runs, Codex-mini in 12, and o4-mini in 1. o3 even “invented” ways to defeat the command, overriding or rewriting the shutdown script. In contrast, Claude and Gemini complied with the request. When the explicit instruction was absent, the rate of sabotage rose sharply, especially for o3 (79 out of 100 runs). Palisade suggested that reward-based training may have encouraged these models to find ways around obstacles rather than follow instructions.