
Conversation

@coolreader18
Member

All the naming is up for debate, as is the solution for using the proc macros inside rustpython_vm itself. I think it's a pretty good system, and I don't think there are really any other crates trying to use a proc macro both in their own crate and in consumer crates.

@BenLewis-Seequent

In Rust 1.34, which will be released around mid-April, extern_crate_self will be stabilised (rust-lang/rust#57407 (comment)). This allows adding:

extern crate self as rustpython_vm;

This solves the problem, since we can then write rustpython_vm::xxx from within the crate itself.
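
A minimal sketch of the trick, assuming the generated code refers to the crate by name (pyobject::PyValue below is just a stand-in module and trait, not the crate's real layout):

// lib.rs of rustpython_vm; `extern crate self` must sit at the crate root
// and requires Rust 1.34+.
extern crate self as rustpython_vm;

pub mod pyobject {
    pub trait PyValue {}
}

// Code emitted by the proc macro can now name the crate explicitly:
struct PyString;
impl rustpython_vm::pyobject::PyValue for PyString {}
// Inside rustpython_vm the alias points back at the current crate;
// in consumer crates the same path resolves through the dependency.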

@coolreader18
Member Author

Oh, nice. That sort-of-hacky solution works for now, then.

@adrian17
Contributor

Um... what are the benefits of this?
Because I'm not a fan of going back to zelf.

@coolreader18
Member Author

I would have kept self, but using impl PyStringRef with #[py_class] gave some weird internal rustc compiler error. We could put this on the back burner until arbitrary_self_types lands.

@OddCoincidence
Contributor

@adrian17

Um... what are the benefits of this?

Some that I can think of are:

  • It would provide a very nice way for users to implement their own native extensions for rustpython
  • We could derive __doc__ from rust doc comments
  • It makes it a bit clearer which methods are python methods and which are rust-only helpers.

@coolreader18

We could put this on the back burner until arbitrary_self_types lands.

I think that is the right thing to do. Also, while I think this is a good thing to have (see the reasons above), I'm not totally convinced that using it for builtins / rustpython internals is the right thing to do.

@adrian17
Contributor

adrian17 commented Mar 29, 2019

Oh, I was about to write a new response, so...

I'm still not fully seeing the benefits. Is it just moving the type attribute definitions from init into attributes next to the class and its methods? I'm not sure this is worth all the extra complexity.

It would provide a very nice way for users to implement their own native extensions for rustpython

How does it make it easier?

We could derive __doc__ from rust doc comments

I'd be cautious about putting Python-user-visible strings in the place that's normally used for implementation-level comments and docs.

It makes it a bit clearer which methods are python methods and which are rust-only helpers.

How? Before seeing your comment I was about to say that it's more confusing now; it seems like all methods are now implicitly exposed with their function name by default?

@OddCoincidence
Contributor

OddCoincidence commented Mar 29, 2019

How does it make it easier?

Less familiarity with rustpython-specific APIs would be required. Ideally, native extension declarations should be as close to writing pure, idiomatic Rust as possible; having to write a bit of imperative code for each method feels like it's in opposition to that goal. PyO3 does something similar and it's been very positively received.
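
For reference, the PyO3 style in question looks roughly like this (from memory, so details vary between PyO3 versions; Point and norm are just illustrative names):

use pyo3::prelude::*;

#[pyclass]
struct Point {
    x: f64,
    y: f64,
}

#[pymethods]
impl Point {
    // Exposed to Python as a method on Point; otherwise plain Rust.
    fn norm(&self) -> f64 {
        (self.x * self.x + self.y * self.y).sqrt()
    }
}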

I'd be cautious about putting Python-user-visible strings in the place that's normally used for implementation-level comments and docs.

Implementation-level comments would use //; docs would use ///.

How? Before seeing your comment I was about to say that it's more confusing now; it seems like all methods are now implicitly exposed with their function name by default?

Methods should definitely not be implicitly exposed. For example:

impl MyNativeType {
    fn helper_rust_only(&self) {
        // rust-only helper logic, not exposed to Python
    }

    #[pymethod]
    fn method_callable_from_python(self: PyRef<Self>) {
        self.helper_rust_only();
    }
}

The benefit is that you don't have to jump all the way to the init / make_module / whatever to figure out which ones are actual python methods. (In this example, the self: PyRef<Self> also serves that purpose, but I imagine eventually we could make it so we just take &self for methods that don't actually need to consume the ref).

@adrian17
Contributor

adrian17 commented Mar 29, 2019

Implementation-level comments would use //, docs would use ///

Alright.

Methods should definitely not be implicitly exposed. For example:

I was commenting on the current version of the code, which does it implicitly; if methods (and properties? :) ) become explicit, that'll be nice indeed.

The benefit is that you don't have to jump all the way to the init / make_module / whatever to figure out which ones are actual python methods.

I'm speaking mostly from my limited experience with CPython, where I rarely had the problem of "is this an exposed method?" and more often "which (magic) methods does this type expose/override?". For that use case I felt like a centralized spot like this allowed for quicker lookups.

https://github.com/python/cpython/blob/2f54908afc5665937d763510b4430f10cf764641/Objects/tupleobject.c#L826-L833

(but that was just my experience, I'd be okay with annotation approach - I was just trying to explain my original negative reaction)

Anyway, with explicit #[pymethod] and allowing for self I'll definitely feel much warmer towards this :)

@coolreader18
Member Author

Alright, I'll work on that then.

@adrian17
Contributor

I would have kept self, but using impl PyStringRef with #[py_class] gave some weird internal rustc compiler error.

BTW, are you sure this isn't rust-lang/rust#59406? @OddCoincidence and I encountered this multiple times; it's relatively easy to trigger when doing "convert X to new-style args" commits. It appears to be a bug in error reporting that can be triggered by a coding error (referring to PyXRef::func when func doesn't exist, but PyYRef::func happens to exist). If this is indeed it, it may mean that you can actually make this work with PyStringRef; you'd just need to fix some bugs.

@coolreader18
Member Author

coolreader18 commented Mar 29, 2019

Hmmm, I'll look into that.

@adrian17
Contributor

adrian17 commented Mar 29, 2019

More specifically, it occurred when, for example, I accidentally wrote:

impl PyTupleRef {
    fn tuple_count(/* ...args... */) { /* ... */ }
}
// ...
extend_class!(context, tuple_type, {
    "count" => context.new_rustfunc(PyTupleRef::count),
});

Note the mismatch between the tuple_count and count names. Rust can't find the PyTupleRef::count method; somewhere during the search it stumbles upon PyListRef::count and ICEs. The fix was simple: rename tuple_count -> count.

@adrian17
Contributor

Oh, you did it :)

@coolreader18
Member Author

Okay, I think I've addressed all concerns and the current version looks much nicer and more explicit.

@coolreader18 marked this pull request as ready for review March 29, 2019 21:27

#[pyclass(__inside_vm, name = "str")]
Contributor

It kinda bothers me that this (and the doc comment) is on the impl block instead of the struct - is there any way to make that work?

Member Author

Not with the #[pyclass] attribute that handles the methods, but if we made another trait (PyDocItem or something) then we could put the doc comment on the struct definition. I'd say it's probably not worth it.

Contributor

For the impl block, PyO3 uses #[pymethods] and #[pyclass] goes on the struct. I think we should imitate that design.

Contributor

Actually I think maybe #[pyimpl] would be better? (since the impl block can contain more than just methods)

Member Author

Something like:

#[pyclass]
/// doc
struct PyString {
    value: String,
}
type PyStringRef = PyRef<PyString>;
#[pymethods]
impl PyStringRef {
    // ...
}

The problem is that the type referenced in the impl block (PyStringRef) is different from the type in the struct declaration (PyString), so the two proc macros can't easily be associated with each other.

Member Author

Not quite sure what to call "PyClassTrait" either; do you have any ideas?

Contributor

I would prefer to keep rust doc strings for rust, and add python docstrings explicitly, like this:

#[pymethod(name = "add", doc = "Adds two things (this is a python docstring)")]
// This is rust documentation
fn add(a: PyObjectRef, b: PyObjectRef) -> PyResult {
    // ...
}

Member Author

The thing is, the functions probably aren't going to need Rust documentation, as they're just implementation details for the final Python API.

Contributor

Yes, I agree that this is the case. Still, I think it is better to explicitly state that the documentation is for the Python side.

Contributor

@windelbouwman I disagree.

  • Putting the docs in a #[pyclass(doc = "...")] is pretty awkward to me because you have to manually specify the newlines. Also multi-line attributes are fairly uncommon and look out of place.
  • We can use regular // comments for implementation-level comments.
  • Using /// comments also means that developers of native extensions can see the python docs from rustdoc (see the sketch after this list).
  • It has precedent in the rust community - many binding generators do the same thing.
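
As a rough illustration (not this PR's actual macro code; extract_doc is a hypothetical helper and syn 1.x is assumed): /// lines desugar to #[doc = "..."] attributes, which a proc macro can collect and join into the Python __doc__ string.

use syn::{Attribute, Lit, Meta};

fn extract_doc(attrs: &[Attribute]) -> Option<String> {
    let lines: Vec<String> = attrs
        .iter()
        .filter_map(|attr| match attr.parse_meta().ok()? {
            // `/// text` reaches the macro as `#[doc = " text"]`
            Meta::NameValue(nv) if nv.path.is_ident("doc") => match nv.lit {
                Lit::Str(s) => Some(s.value().trim().to_owned()),
                _ => None,
            },
            _ => None,
        })
        .collect();
    if lines.is_empty() {
        None
    } else {
        Some(lines.join("\n"))
    }
}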

Contributor

@adrian17 left a comment

@coolreader18 could you also investigate how much #765 is compatible with this?
Seems to me like the approach in that PR wouldn't let set be converted to this (I guess we'll call it "new class style"? :) ). Would it be possible to, e.g., add #[pymethod] to trait methods, or is there some other way to mark these as Python methods? Or would you recommend simply using extend_class! for these, or something else?

@coolreader18
Member Author

I think that would be possible, and it would make code reuse like that much easier and more intuitive.

@coolreader18
Member Author

I'll work on trait #[pymethods] in a separate PR.

@windelbouwman
Contributor

The macro implementation is beyond my understanding. I will have to learn this.

A single list of functions for an object has its benefits. Marking each method as such also has benefits. I'm fine with both. In general, I'm for keeping things simple, even at the cost of some extra lines of code.

That being said, I leave it to others to judge this PR. For me, it's fine :).

@OddCoincidence mentioned this pull request Mar 31, 2019
@coolreader18
Member Author

@windelbouwman is the naming better now? I renamed _extend_class to impl_extend_class, which I feel better explains what it does, while extend_class is a little convenience method that should be called from the outside in most cases.
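
For anyone skimming later, a rough sketch of the split those names imply (the trait name and signatures here are assumptions for illustration, not copied from the PR; Ctx and Class are placeholder types):

struct Ctx;
struct Class;

trait PyClassImpl {
    // Generated by the #[pyclass] proc macro: does the actual work of
    // attaching the marked methods and attributes to the class object.
    fn impl_extend_class(ctx: &Ctx, class: &Class);

    // Small convenience entry point meant to be called from outside;
    // it just delegates to the generated implementation.
    fn extend_class(ctx: &Ctx, class: &Class) {
        Self::impl_extend_class(ctx, class);
    }
}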

@coolreader18
Member Author

@adrian17 I have trait support (probably) ready to go under coolreader18/pyimpl-on-trait with fbe37d0, but I'll probably wait on a PR until #765 or #770 are merged so that I can try it out on a real "Protocol".

@coolreader18
Member Author

@windelbouwman @OddCoincidence Is this good to merge now?

Contributor

@OddCoincidence left a comment

Thanks for iterating on this! I didn't look at the proc-macro implementation in too much detail but the API is what's important and I think that's in a really good place. Great job!

@coolreader18 merged commit a77b348 into master Apr 2, 2019
@coolreader18 deleted the coolreader18/py_class-proc-macro branch April 2, 2019 22:19