Question from ignorance: is this a problem unique to Crystal not asking for enough type information, or does it also come up in other languages with type inference?
You can do “perfect” type inference using Hindley-Milner in a language without “extends” inheritance or higher kinds, meaning type annotations are (at least theoretically) never necessary. That’s what ML and related languages tend to do. In more powerful type systems, inference tends to be more ad hoc and to require type annotations in some circumstances.
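A rough sketch of the contrast, written in TypeScript as a stand-in since an ML example wouldn't run here (the snippet and names are mine, not from the thread). In an ML, `let id x = x` needs no annotation and gets the most general type; TypeScript's inference is local rather than Hindley-Milner, so the generic type must be written on the definition, though call sites are still inferred:

```typescript
// Hindley-Milner would infer the most general type of `id` from its body.
// TypeScript cannot: an unannotated parameter is implicitly `any`,
// so the generic signature has to be spelled out here.
function id<T>(x: T): T {
  return x;
}

// Inference does work at use sites: T is instantiated without annotations.
const n = id(42);      // T inferred as number
const s = id("hello"); // T inferred as string

console.log(n, s);
```

This is the "more ad hoc" middle ground the comment describes: some annotations required, the rest inferred.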
AIUI Crystal wants to have “traditional” OO, so pure H-M doesn’t apply. This is one of the reasons functional languages tend to only allow “sum types” rather than “extends” inheritance.
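The sum-type alternative mentioned above can be sketched like this (again in TypeScript as a stand-in for an ML; the `Shape` example is mine). A sum type is a closed set of variants, so the checker knows every case up front, unlike an open `extends` hierarchy that anyone can add subclasses to:

```typescript
// A sum type ("tagged union") in the ML style: a closed set of variants.
// Because the set is closed, the compiler can check the switch below
// for exhaustiveness -- something an open class hierarchy rules out.
type Shape =
  | { kind: "circle"; radius: number }
  | { kind: "square"; side: number };

function area(s: Shape): number {
  switch (s.kind) {
    case "circle": return Math.PI * s.radius ** 2;
    case "square": return s.side ** 2;
  }
}

console.log(area({ kind: "square", side: 3 })); // 9
```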
The difficulty is not OO so much as subtyping (objects and the self-types they necessitate bring their own difficulties on top of that). I’m actually doing research on a functional language with subtyping which will hopefully solve a problem similar to Crystal’s, namely allowing the feel of untyped programming while statically catching many errors. We’re building a sound and decidable type system (no performance numbers yet, though) based on subtype constraints, with an object-oriented layer on top of the core functional language.
ETA: link to project page
The problem is not type inference itself (technically H-M has an exponential corner case for pathological programs, but it’s never a problem in practice); it’s type inference in a system with polymorphism and subtyping, which is necessary to get expressiveness anywhere near Ruby’s.
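One way to see why subtyping complicates inference (a TypeScript sketch of my own, not from the thread): once a value can have many types, an expression no longer has a unique most-general type the way it does under Hindley-Milner, and the checker has to compute unions or least upper bounds instead:

```typescript
interface Dog { name: string; bark(): string }
interface Cat { name: string; meow(): string }

const dog: Dog = { name: "Rex", bark: () => "woof" };
const cat: Cat = { name: "Mia", meow: () => "meow" };

// With subtyping there is no single principal type for this array:
// the checker must pick some upper bound of Dog and Cat. TypeScript
// picks the union (Dog | Cat)[]. A Ruby-like language faces the same
// question implicitly at every method call.
const pets = [dog, cat];

// Only members common to every branch of the union are safely usable;
// calling p.bark() here would be a type error.
const names = pets.map(p => p.name);
console.log(names);
```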