Openai/69167e09-7e2c-8012-a502-17d3263c99f2
This conversation was exported from ChatGPT[1]
gpt-5-pro
User: Why is a review by Andrej Karpathy of the HW4 Tesla Model X not significant?
Assistant: Short answer: it’s an anecdote, not evidence.
Why Karpathy’s HW4 Model X “review” isn’t significant as an assessment of Tesla FSD:
- Conflict of interest / proximity to the product. Andrej Karpathy led Tesla’s Autopilot vision/AI team for years. That makes him a highly informed observer—but not an independent one. His professional reputation is tied to the approach Tesla took, so his praise carries unavoidable bias. Wikipedia[2]
- It’s a single, uncontrolled demo, shared on social media. His comments (“Basically… I’m amazed”) were posted after taking delivery of his own HW4 Model X, not gathered through a blinded, repeatable test protocol across routes, weather, or drivers. That’s classic n=1 evidence, great for color but weak for inference; a rough illustration of how little one drive constrains the numbers follows this list. X (formerly Twitter)[3]
- Hardware/software specifics limit generalization. He was driving HW4 on a recent v13.x build, which differs materially from earlier hardware (camera suite and compute) and from cars still on HW3. In fact, FSD v13 first exploited HW4’s native camera resolution and, for a time, was HW4‑only, so results on his Model X may not translate to Model 3/Y or to HW3 vehicles. Wikipedia[4]
- Regulatory and safety context matters more than a positive drive. U.S. safety regulators are actively probing FSD after reports of traffic‑law violations and crashes; they’re also still scrutinizing the adequacy of Tesla’s prior Autopilot recall. That ongoing fact pattern weighs much more heavily than one glowing ride-along. AP News[5]
- Driver‑assist systems are known to encourage misuse without strong safeguards. Independent research (e.g., IIHS and MIT’s AVT program) has documented how partial automation can degrade driver attention—another reason anecdotal “it worked great for me” reports are poor safety evidence. IIHS Crash Testing[6]
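To make the “weak for inference” point concrete, here is a minimal back‑of‑the‑envelope sketch. It assumes problem events per mile are roughly Poisson, and the 50‑mile drive length is a purely hypothetical figure (Karpathy did not report mileage); under those assumptions, the “rule of three” bounds what zero observed problems can actually tell you.

```python
# Back-of-the-envelope: how little a single uneventful drive constrains the
# failure rate. Assumes events per mile are roughly Poisson; the 50-mile
# drive length is hypothetical, not taken from Karpathy's posts.

def rule_of_three_upper_bound(miles_observed: float) -> float:
    """Approximate 95% upper confidence bound on events per mile
    after observing zero events over `miles_observed` miles."""
    return 3.0 / miles_observed

single_drive = rule_of_three_upper_bound(50)         # one anecdotal drive
fleet_sample = rule_of_three_upper_bound(1_000_000)  # an audited fleet sample

print(f"One 50-mile drive, zero events: rate could still be up to "
      f"{single_drive:.3f}/mile (~1 every {1 / single_drive:.0f} miles)")
print(f"1,000,000 audited miles, zero events: rate bounded below "
      f"{fleet_sample:.8f}/mile (~1 every {1 / fleet_sample:,.0f} miles)")
```

The asymmetry is the point: one clean anecdotal drive still leaves the plausible failure rate orders of magnitude above the per‑million‑mile scale at which fleet safety claims are made.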
What would be significant?
- Independent, multi‑city testing that reports interventions/disengagements per mile by version and hardware (HW3 vs HW4); a sketch of that kind of aggregation follows this list.
- Audited crash and near‑miss rates per million miles with and without FSD, normalized for routes, weather, and time of day.
- Third‑party evaluations of driver monitoring and misuse prevention (e.g., IIHS safeguard ratings). IIHS Crash Testing[7]
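As a rough illustration of the first bullet, here is a minimal sketch of the kind of aggregation an independent evaluation would publish. Every record below (hardware platform, build number, mileage, intervention count) is invented for illustration; a real study would draw on audited, multi‑city drive logs.

```python
# Sketch: aggregate intervention counts into a per-1,000-mile rate, broken out
# by hardware platform and FSD build. All records below are made up for
# illustration; a real study would use audited, multi-city drive logs.
from collections import defaultdict

drives = [
    # (hardware, software_version, miles_driven, interventions)
    ("HW3", "v12.5", 180.0, 4),
    ("HW3", "v12.5", 220.0, 6),
    ("HW4", "v13.2", 200.0, 2),
    ("HW4", "v13.2", 250.0, 1),
]

totals = defaultdict(lambda: [0.0, 0])  # (hw, sw) -> [miles, interventions]
for hw, sw, miles, interventions in drives:
    totals[(hw, sw)][0] += miles
    totals[(hw, sw)][1] += interventions

for (hw, sw), (miles, interventions) in sorted(totals.items()):
    rate = 1000.0 * interventions / miles  # interventions per 1,000 miles
    print(f"{hw} {sw}: {miles:.0f} mi, {interventions} interventions "
          f"-> {rate:.1f} per 1,000 mi")
```

Even a breakdown like this is only meaningful if routes, weather, and driver behavior are controlled or normalized across the cells being compared, which is exactly what a single owner’s drive cannot provide.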
If you want, I can pull together a side‑by‑side of what independent labs and regulators have published on HW4‑compatible FSD builds versus prior versions.
Recent, credible context on Tesla FSD investigations:
- The Washington Post, “Federal officials probe Tesla 'Full Self-Driving' over traffic violations,” Oct 9, 2025.[8]
- AP News, “Tesla hit with probe after crashes involving a self-driving feature that Musk has boasted about,” Oct 9, 2025.[9]
- AP News, “Tesla's recall of 2 million vehicles to fix its Autopilot system uses technology that may not work,” Dec 19, 2023.[10]
References
- ↑ "Karpathy HW4 review significance". ChatGPT. Retrieved 2025-11-13.
- ↑ "Wikipedia". en.wikipedia.org. Retrieved 2025-11-13.
- ↑ "X (formerly Twitter)". X (formerly Twitter). Retrieved 2025-11-13.
- ↑ "Wikipedia". en.wikipedia.org. Retrieved 2025-11-13.
- ↑ "AP News". AP News. Retrieved 2025-11-13.
- ↑ "IIHS Crash Testing". IIHS Crash Testing. Retrieved 2025-11-13.
- ↑ "IIHS Crash Testing". IIHS Crash Testing. Retrieved 2025-11-13.
- ↑ "The Washington PostFederal officials probe Tesla 'Full Self-Driving' over traffic violationsOct 9, 2025". The Washington PostFederal officials probe Tesla 'Full Self-Driving' over traffic violationsOct 9, 2025. 2025-10-09. Retrieved 2025-11-13.
- ↑ "AP NewsTesla hit with probe after crashes involving a self-driving feature that Musk has boasted aboutOct 9, 2025". AP NewsTesla hit with probe after crashes involving a self-driving feature that Musk has boasted aboutOct 9, 2025. Retrieved 2025-11-13.
- ↑ "AP NewsTesla's recall of 2 million vehicles to fix its Autopilot system uses technology that may not workDec 19, 2023". AP NewsTesla's recall of 2 million vehicles to fix its Autopilot system uses technology that may not workDec 19, 2023. Retrieved 2025-11-13.