Families of victims of a Tumbler Ridge shooting have filed lawsuits alleging that ChatGPT assisted the killer's preparations rather than intervening. The claims argue the chatbot provided critical assistance in the lead-up to the attack. The case tests whether AI providers can be held legally liable for harm caused by their users, forcing a direct confrontation between model safety guardrails and real-world violence.