Support Java 17 `sealed` in Java sources #10348
Conversation
SethTisue commented Mar 29, 2023
f2a167c to 8751aeb
8751aeb to eeac512

som-snytt commented Apr 1, 2023 (edited)
The updated commit just "registers" Java classes and matches for completion after typer, where (per lrytz) either there is a permits or we look at classes from the same source file. The simple impl could be improved or complexified further. Or simplified.
eeac512 to 35c0384

lrytz commented Apr 3, 2023
Thanks, great to see that we can make it work! Why do you think we should delay until after typer? Forcing is what the typer does; I don't see a particular risk of running into cycles here, since we don't need to infer anything to complete these class symbols. For finding symbols defined in a particular compilation unit, I also think we need new global state; there doesn't seem to be anything we can reuse. A question is where to put it, though I guess it doesn't matter too much. Here's a draft, what do you think (last commit)? 2.13.x...lrytz:scala:pr10348
som-snytt commented Apr 3, 2023 (edited)
Yes, that is what I was thinking of (a perRunCaches Map). I thought keeping the logic in one place was simpler: "children we know about" and "other offspring". Oh, I see what you mean: we've already seen all the classes for this unit, so nothing is gained by waiting.

The other piece I started looking at was emitting the `PermittedSubclasses` attribute for Scala sealed. I'll try to add a commit for that; I was about to imitate some structure for innerClasses. A final clean-up is also to review the tests for duplication or completeness. Also, this is a patmat feature, so I like that it looks that way by putting the unwieldy sources map there.
som-snytt commented Apr 3, 2023 (edited)
I added your commits and will push after cleaning up my detritus. Also, I haven't touched the java parser to look more like your earlier edit, which I remember preferring.
35c0384 to a646610

```scala
val pluginsTp = pluginsTypeSig(res, typer, cdef, WildcardType)
cdef.getAndRemoveAttachment[PermittedSubclasses].foreach { permitted =>
  clazz.updateAttachment[PermittedSubclassSymbols] {
    PermittedSubclassSymbols(permitted.permits.map(typer.typed(_, Mode.NOmode).symbol))
  }
}
```
what happens to errors on bad permits clause here?
lrytz left a comment
We forgot about inner classes...
```java
// X.java
package p;

public sealed interface X { public default int x() { return 1; } }
final class Y implements X { }
final class O { final static class Z implements X { } }
```

```scala
// T.scala
package p

class T { def n(a: X) = a match { case _: Y => 0 } }
```

Should warn but doesn't in mixed compilation. Namer eagerly creates symbols for top-level classes, but symbols for inner classes are only created when the outer class completes.
So if there's a sealed Java type without a `permits` clause, we probably have to recursively force all classes defined in the same compilation unit.
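The same-file scenario above can be seen from the Java side. With no `permits` clause, javac infers the permitted subclasses from the compilation unit, nested classes included, and records them in the class file, which is exactly what forcing has to surface. A minimal sketch (type names are illustrative; needs Java 17+):

```java
// One compilation unit. X has no permits clause, so javac infers the
// permitted subclasses from this file -- nested O.Z included, which is
// why inner classes have to be forced too.
sealed interface X {}

final class Y implements X {}

final class O {
    static final class Z implements X {}
}

public class Main {
    public static void main(String[] args) {
        // The inferred permits are recorded in the class file and are
        // visible through reflection on Java 17+.
        for (Class<?> c : X.class.getPermittedSubclasses())
            System.out.println(c.getName());
    }
}
```

Running this prints the binary names of both implementors, `Y` and `O$Z`; an exhaustivity check on `X` must likewise see both, even though `Z`'s symbol only exists once `O` completes.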
```
X.java:9: error: anonymous classes must not extend sealed classes
        X bar = new X() { };
                ^
```

😅 OK, at least that.
@retronym brought runtime reflection to the table:
```scala
scala> import scala.reflect.runtime.{universe => ru}
import scala.reflect.runtime.{universe=>ru}

scala> ru.typeTag[p.X].tpe.typeSymbol.asClass.knownDirectSubclasses
val res0: Set[reflect.runtime.universe.Symbol] = Set()
```

Also, macros / compiler plugins might want to see children, not only the exhaustivity checker. But I think it's fine to defer that.
2bf8131 to 62c5acf

som-snytt commented Apr 26, 2023
@lrytz this will be green after squashing. I guess "zucchini", as a green squash. I'll take another look at the tweak to populating the sealed class hierarchy. It needs tests with
Required for checking permitted subclasses and for exhaustiveness of patterns.

Co-authored-by: Lukas Rytz <lukas.rytz@gmail.com>
9c3e87d to a0f546d
lrytz left a comment
🥒 ed
LGTM, thank you!
### What changes were proposed in this pull request?

This PR aims to upgrade Scala to 2.13.11.

- https://www.scala-lang.org/news/2.13.11

Additionally, this PR adds a new suppression rule for the warning message `Implicit definition should have explicit type`, a new compile check introduced by scala/scala#10083; we must fix it when upgrading to Scala 3.

### Why are the changes needed?

This release improves collections, adds support for JDK 20 and 21, and adds support for JDK 17 `sealed`:

- scala/scala#10363
- scala/scala#10184
- scala/scala#10397
- scala/scala#10348
- scala/scala#10105

There are 2 known issues in this version:

- scala/bug#12800
- scala/bug#12799

For the first one, there are no compilation warnings related to `match may not be exhaustive` in the Spark compile log. For the second one, there is no use of `method.isAnnotationPresent(Deprecated.class)` in Spark code; there is just https://github.com/apache/spark/blob/8c84d2c9349d7b607db949c2e114df781f23e438/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/JavaTypeInference.scala#L130 in Spark code, and I checked that `javax.annotation.Nonnull` does not have this issue. So I think these two issues will not affect Spark itself, but this doesn't mean they won't affect code written by end users.

The full release notes:

- https://github.com/scala/scala/releases/tag/v2.13.11

### Does this PR introduce _any_ user-facing change?

Yes, this is a Scala version change.

### How was this patch tested?

- Existing tests
- Checked Java 8/17 + Scala 2.13.11 using GA, all tests passed
  - Java 8 + Scala 2.13.11: https://github.com/LuciferYang/spark/runs/14337279564
  - Java 17 + Scala 2.13.11: https://github.com/LuciferYang/spark/runs/14343012195

Closes #41626 from LuciferYang/SPARK-40497.

Authored-by: yangjie01 <yangjie01@baidu.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
### What changes were proposed in this pull request?

This PR aims to re-upgrade Scala to 2.13.11; after SPARK-45144 was merged, the build issues mentioned in #41943 should no longer exist.

- https://www.scala-lang.org/news/2.13.11

Additionally, this PR adds a new suppression rule for the warning message `Implicit definition should have explicit type`, a new compile check introduced by scala/scala#10083; we must fix it when upgrading to Scala 3.

### Why are the changes needed?

This release improves collections, adds support for JDK 20 and 21, and adds support for JDK 17 `sealed`:

- scala/scala#10363
- scala/scala#10184
- scala/scala#10397
- scala/scala#10348
- scala/scala#10105

There are 2 known issues in this version:

- scala/bug#12800
- scala/bug#12799

For the first one, there are no compilation warnings related to `match may not be exhaustive` in the Spark compile log. For the second one, there is no use of `method.isAnnotationPresent(Deprecated.class)` in Spark code; there is just https://github.com/apache/spark/blob/8c84d2c9349d7b607db949c2e114df781f23e438/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/JavaTypeInference.scala#L130 in Spark code, and I checked that `javax.annotation.Nonnull` does not have this issue. So I think these two issues will not affect Spark itself, but this doesn't mean they won't affect code written by end users.

The full release notes:

- https://github.com/scala/scala/releases/tag/v2.13.11

### Does this PR introduce _any_ user-facing change?

Yes, this is a Scala version change.

### How was this patch tested?

- Existing tests

### Was this patch authored or co-authored using generative AI tooling?

No

Closes #42918 from LuciferYang/SPARK-40497-2.

Authored-by: yangjie01 <yangjie01@baidu.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
Support `sealed` in Java sources. The related PR supports separate compilation, where the `PermittedSubclasses` attribute specifies sealedness; there seems to be no ticket for that, alas. This PR is Java interop gravy.

Fixes scala/bug#12171
Fixes scala/bug#12159
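The separate-compilation story mentioned above rests on the class-file `PermittedSubclasses` attribute. A quick way to see what javac records there (type names are illustrative; needs Java 17+) is Java's own reflection:

```java
// With an explicit permits clause, javac writes the listed classes
// into the PermittedSubclasses class-file attribute -- the bit that a
// separate compilation run (or a compiler reading the class file)
// gets to see.
sealed interface Shape permits Circle, Square {}

final class Circle implements Shape {}
final class Square implements Shape {}

public class Main {
    public static void main(String[] args) {
        // Both facts below are read back from the class file.
        System.out.println(Shape.class.isSealed());
        for (Class<?> c : Shape.class.getPermittedSubclasses())
            System.out.println(c.getName());
    }
}
```

This prints `true` followed by the names of `Circle` and `Square`; a downstream compilation that only has `Shape.class` on the classpath can recover the full set of permitted subclasses the same way.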