Crystal: bug: inheriting `self` twice produces `Person+ is not a class, it's a Person+`

Created on 31 Dec 2017 · 15 comments · Source: crystal-lang/crystal

Bug

Using Crystal v0.24.1 (https://play.crystal-lang.org/#/r/3bhz), the following code produces an error:

class Person
  class Current < self
  end

  class Destroyed < self
  end  
end

Error: Person+ is not a class, it's a Person+

After I posted about it on Stack Overflow, it was suggested that this behavior is a bug.

Additional info

If I do:

class Person
  class Current < Person
  end

  class Destroyed < Person
  end
end

Then everything works fine.

If I do:

class Person
  class Current < self
  end
end

Then everything works fine.

Labels: bug, compiler


All 15 comments

A devirtualize call is probably missing somewhere. It sounds easy to fix, if someone wants to pick it up before I eventually get to it.

As a side note, self here looks pretty cryptic. I'm now more inclined to remove the self type restriction and resolution from the language.

@asterite, by

remove self type restriction and resolution from the language

do you mean: remove the ability to write

class Person
  class Current < self
  end

  class Destroyed < self
  end  
end

?

Yes:

class Person
  class Current < Person
  end

  class Destroyed < Person
  end  
end

(but it's just a thought, might not happen after all)

Well, I can certainly appreciate an argument for removing it from a maintainability / development standpoint -- though I can't speak to such an argument myself. From a readability / usability standpoint, however, I like the current functionality (though that may reflect my background as a Ruby developer).

I ran into the issue in the first place while porting some of my Ruby code over to Crystal. I was writing a macro that would add Current and Destroyed classes to a class. Being able to inherit from self made it easy.

macro define_states
  class Current < self
  end

  class Destroyed < self
  end
end

class Person
  define_states
end

In Ruby I accomplish this with a module, but it's even _easier_ to do in Crystal with a macro. I imagine I could add an argument to the macro to use it like define_states(Person), but using self feels more elegant to me :). And, frankly, its meaning just seems obvious to me ("oh yeah, this class inherits from whatever class this is defined in"). But again, this could just be because of my Ruby background.

I find the Crystal version of the code to be a _lot_ more readable than the Ruby version:

const_set("Current", Class.new(self) {})
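For comparison, the full module-based version in Ruby might look something like this (the StateDefinable module name is illustrative, not from the original code):

```ruby
# A Ruby sketch of the module approach: including the (hypothetical)
# StateDefinable module defines Current and Destroyed subclasses
# inside the including class via const_set.
module StateDefinable
  def self.included(base)
    base.const_set("Current", Class.new(base) {})
    base.const_set("Destroyed", Class.new(base) {})
  end
end

class Person
  include StateDefinable
end

Person::Current < Person    # => true
Person::Destroyed < Person  # => true
```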

Update

I suppose your change would make it impossible to do it the "Ruby way" as well.

And, given that I'd be using this particular macro all over the place, needing to add in the appropriate argument would be mildly annoying.

class Person
  define_states(Person)
end

class Job
  define_states(Job)
end

class Company
  define_states(Company)
end

(and it's probably my Ruby background speaking again, but I'd instantly wonder "Why do I need to keep adding in the class to the macro?" It's obvious that I want to inherit self...)

On a somewhat unrelated note, I think the coolest thing about Crystal is how _accessible_ it is. I taught myself Ruby (starting out knowing no programming at all) as part of a volunteer project to build an app for a nonprofit. Because of the great syntax and relative flexibility I was able to go from zero to Rails app in a matter of months. I had initially thought about building the app in Javascript, before deciding Ruby & Rails would be a better choice. I've since delved into Javascript and Typescript, as well as some C++, and they are nowhere _near_ as easy to pick up (affirming, for me, my original decision to pick Ruby).

By its very nature as a statically typed, compiled language, I'm not sure if Crystal can ever be _quite_ as easy as Ruby to learn (though then again, maybe Types (and everything they open up in IDEs) will make Crystal _easier_ to learn), but hands down I'm excited about Crystal because of just how approachable it is. I don't need to worry about specifying types all over the place and if I change a method I don't need to update the types. I don't need to worry about all the brackets. Reading it is easy to understand. Etc. It's just so approachable for a newbie.

I can imagine a future in which Crystal is a great language for someone to learn _first_.

Here's a technical reason to keep the self restriction:

module Foo
  def add(other : self)
  end
end

class A
  include Foo
end

class B
  include Foo
end

A.new.add(B.new)

https://carc.in/#/r/3c7u

But we're diverging from the issue...

@jhass We can get the same result by using a macro:

module Foo
  macro included
    def add(other : {{@type}})
    end
  end
end

class A
  include Foo
end

class B
  include Foo
end

A.new.add(B.new)

However, I'm not sure it's the right way.
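For what it's worth, a rough Ruby analogue of that macro trick would enforce the same per-includer restriction, though only at runtime (Ruby has no compile-time type restrictions); this is just a sketch, not anything from the issue:

```ruby
# Each class that includes Foo gets its own `add` method that only
# accepts instances of that same including class, checked at runtime.
module Foo
  def self.included(base)
    base.send(:define_method, :add) do |other|
      unless other.is_a?(base)
        raise TypeError, "expected #{base}, got #{other.class}"
      end
    end
  end
end

class A; include Foo; end
class B; include Foo; end

A.new.add(A.new)        # fine
begin
  A.new.add(B.new)      # raises TypeError
rescue TypeError => e
  puts e.message
end
```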

Me neither; it seems difficult to explain to newcomers and for them to reason about.

The problem is what self means. Is it equal to the type that defines it, or to the effective type that's executing the method?

For example, should this compile?

class Foo
  def foo(other : self)
  end
end

class Bar < Foo
end

class Baz < Foo
end

bar = Bar.new
baz = Baz.new
bar.foo(baz)

Maybe it needs to become more explicit, something akin to Java's wildcard type restrictions (? extends Foo, ? is self), just brainstorming...

The above is the cause of this bug, and the same happens with macro methods: because the method ends up in the deepest class, comparing against self doesn't work. In this case I don't know what the solution is; there's no type you can compare against. Removing self means the only way to do it is by checking at runtime (the solution we ended up with).

That's why I'd like to remove self as a type restriction: it doesn't work as one would expect.

Hmm, before this conversation I interpreted self like so:

class Person
  this = self
  class Current < this
  end
end

class Foo
  this = self
  def foo(other : this)
  end
end

Now I'm wondering if the second example should be interpreted as

class Foo
  def foo(other : this)
    this = self
  end
end

?
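As an aside, the first interpretation can be written directly in real Ruby, where self in a class body is the class being defined (this is plain Ruby, not a claim about Crystal's semantics):

```ruby
class Person
  this = self                # inside a class body, self is Person
  Current = Class.new(this)  # Person::Current, a subclass of Person
end

Person::Current < Person  # => true
```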

I hadn't really thought about it before, but now I'm realizing I'm not sure if the type of Foo is equal to the type of Foo.new.

Whichever interpretation is correct, though, it might not be what someone expects on first encountering the language; but once shown the right way of looking at it, I think it'll be readily understandable.

Indeed, I can see arguments for either interpretation of the def foo method. On the one hand, def foo is executing in the context of the Foo class, on the other hand, the contents of methods refer to the instance calling them, so maybe a method's argument definition is part of the instance context. I feel like, so long as things are consistent, I can find either explanation understandable.

Update

Regarding @asterite's comment of

should this compile?

If self were interpreted this way:

class Foo
  this = self
  def foo(other : this)
  end
end

class Bar < Foo
end

class Baz < Foo
end

bar = Bar.new
baz = Baz.new

Then bar.foo(baz) should succeed (assuming the type of Foo equals the type of Foo.new).
But if the interpretation is

class Foo
  def foo(other : this)
    this = self
  end
end

class Bar < Foo
end

class Baz < Foo
end

bar = Bar.new
baz = Baz.new

Then bar.foo(baz) should fail.
I feel like, as long as things are consistent, this would be readily learnable / intuitive.
