Finite precision in floating-point arithmetic can trigger a chaotic "avalanche effect" in early Transformer layers: tiny rounding differences are amplified layer by layer until they flip discrete outcomes, such as which token greedy decoding selects. This research identifies how such minor rounding errors grow into binary divergences, producing unpredictable run-to-run model behavior. These instabilities compromise reliability in agentic workflows, where one divergent output can redirect every subsequent step. Practitioners should therefore account for hardware-level numerical precision when auditing LLM consistency and stability.
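A minimal sketch of the underlying mechanism: floating-point addition is not associative, so the same values summed in a different order (as can happen across non-deterministic parallel reductions) round to slightly different results, and near a tie that one-ulp difference flips an argmax. The two-token logit vectors below are hypothetical, chosen only to illustrate the effect.

```python
# Floating-point addition is not associative: the same three numbers,
# summed in two different orders, round to different results.
a = (0.1 + 0.2) + 0.3   # 0.6000000000000001
b = 0.1 + (0.2 + 0.3)   # 0.6
print(a == b)           # False

# When two logits are nearly tied, that tiny rounding difference is
# enough to flip a greedy argmax -- a discrete, "binary" change in
# the selected token. (Hypothetical two-token logit vectors.)
logits_run1 = [0.6, a]  # reduction happened in one order
logits_run2 = [0.6, b]  # same computation, different order

tok1 = logits_run1.index(max(logits_run1))  # picks token 1
tok2 = logits_run2.index(max(logits_run2))  # picks token 0
print(tok1, tok2)
```

In a real model this perturbation would then propagate through dozens of layers before reaching the output logits, which is why small hardware- or kernel-level differences can surface as entirely different generations.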