| Commit message | Author | Age | Files | Lines |
... | |
| | |
|
|/
|
|
|
|
|
|
| |
There's no point transforming annotations that come from
classfiles. It's inefficient to do so and it's also risky
because it means we'd have to make sense of Scala-2 generated trees.
This should avoid the error in #1222.
|
| |
|
|
|
|
|
| |
No longer needed because we are going to allow dependent method types
in extractors, and the unary requirement is kind of obvious.
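A minimal Scala 3 sketch of what this enables, with invented names (`Box` and `Dep` are illustrative, not from the codebase): an extractor whose `unapply` result type depends on its parameter.

```scala
object Sketch {
  class Box { type Elem = Int; val value: Elem = 42 }
  object Dep {
    // Dependent method type: the result type mentions the parameter `b`.
    def unapply(b: Box): Some[b.Elem] = Some(b.value)
  }
  def main(args: Array[String]): Unit =
    new Box match { case Dep(x) => assert(x == 42) }
}
```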
|
|
|
|
|
| |
Since we now recognize more false dependencies,
we have to dealias the new dependencies accordingly.
|
|
|
|
|
|
| |
Triggered by a change in TypeComparer. I guess we should think of
dropping the NotDefinedHere tests; for a long time they have given us only
false negatives.
|
|
|
|
|
| |
Now explains in detail why a possibly found unapply or
unapplySeq is ineligible.
|
|
|
|
|
|
|
|
|
| |
#1235.scala contains a case of a method type of the form
(x: T) ... x.tail.N ...
where N is an alias. We need to follow the alias to prevent
a mischaracterization that this is a dependent method type.
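The shape described above can be sketched as follows, with illustrative names (`Tail`, `T`, `N` stand in for the types in the test case): the result type mentions `x`, but dealiasing `x.tail.N` yields a non-dependent type.

```scala
object Sketch {
  class Tail { type N = Int }            // `N` aliases a non-dependent type
  class T { val tail: Tail = new Tail }
  // Looks dependent (the result mentions `x`), but `x.tail.N`
  // dealiases to `Int`, so the method type is not truly dependent.
  def f(x: T): x.tail.N = 1
  def main(args: Array[String]): Unit = assert(f(new T) == 1)
}
```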
|
|
|
|
|
|
| |
More generally, avoid forming a type selection on a term prefix which
has a bottom class as a type. There might be other places where we have
to take similar measures. For now, this one fixes #1235.
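A hypothetical sketch of such a bottom-typed prefix (names invented for illustration): `Empty.tail` has type `Nothing`, a bottom class, so a selection like `Empty.tail.N` on that prefix should never be formed.

```scala
object Sketch {
  trait LL { type N = Int; def tail: LL }
  object Empty extends LL {
    // The prefix `Empty.tail` has the bottom type `Nothing`.
    def tail: Nothing = throw new NoSuchElementException("empty")
  }
  def main(args: Array[String]): Unit =
    try { Empty.tail; assert(false) }
    catch { case _: NoSuchElementException => () }
}
```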
|
|
|
|
|
|
|
|
|
|
| |
There's a trap otherwise: in a class inheriting
from Context (and with it Reporting), a call to println will
go to this.println and therefore might not print at all if
the current context buffers messages. I lost a lot of time
on this on several occasions, scratching my head over why
a simple debug println would not show. Better to avoid this in
the future, for myself and others.
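The trap can be reproduced with a minimal sketch (the names mirror dotc's `Context`/`Reporting`, but the bodies are invented): a bare `println` resolves to the inherited, buffering member rather than `Predef.println`.

```scala
object Sketch {
  trait Reporting {
    val buffer = scala.collection.mutable.ListBuffer.empty[String]
    def println(msg: String): Unit = buffer += msg  // buffers instead of printing
  }
  class Ctx extends Reporting {
    // Resolves to this.println, not Predef.println: nothing reaches stdout.
    def debug(): Unit = println("where did this go?")
  }
  def main(args: Array[String]): Unit = {
    val c = new Ctx
    c.debug()
    assert(c.buffer.toList == List("where did this go?"))
  }
}
```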
|
|\
| |
| | |
arrayConstructors: do not rewrite ofDim for arrays of value classes
|
| | |
|
|\ \ |
|
| |\ \
| | |/
| |/| |
Allow specifying a per-callsite @tailrec annotation.
|
| | | |
|
| | |
| | |
| | |
| | | |
See examples in following commit.
|
| | | |
|
| | | |
|
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | | |
When an implicit argument is not found, we should in any case
assume the result type of the implicit method as the type of
the tree (after reporting an error, of course). If we don't
do that, we get implicit errors at weird positions when we try
to find an implicit argument for the same tree again. This
caused a spurious error in subtyping.scala, and also caused
an additional error at the end of EqualityStrawman1.scala.
|
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | | |
The logic in TypeComparer#eitherIsSubtype was flawed.
In the case of A & B <: C, if A <: C but not B <: C
we need to return with the constraint of A <: C, but
we returned with the initial constraint instead.
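The situation can be illustrated at the source level (Scala 3 syntax; `A`, `B`, `C` are placeholders): `A & B <: C` holds because the left operand conforms to `C`, and the constraints recorded while checking that branch must be the ones returned.

```scala
object Sketch {
  trait A; trait B
  type C = A
  // A & B <: C holds via the first alternative (A <: C).
  def upcast(x: A & B): C = x
  def main(args: Array[String]): Unit =
    assert(upcast(new A with B {}).isInstanceOf[A])
}
```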
|
|/ /
| |
| |
| |
| |
| | |
implicit.
This was suggested in #878.
|
|\ \
| | |
| | | |
Fix strawmans
|
| | |
| | |
| | |
| | |
| | | |
Bring strawman-4 and strawman-5 to feature-parity.
Test also strawman-4.
|
| | | |
|
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | | |
This means companions will be pickled and we can drop
the special treatment in transformInfo of FirstTransform.
That method is problematic, since it enters new symbols into
a class scope. This is not allowed, since transformInfo needs
to be purely functional, side effects are not permitted
(`enteredAfter` does not work either).
The problem manifested itself when compiling colltest5 with
a requirement failure in the code of `entered` when called
from FirstTransform (trying to enter in a frozen class).
TODO: Once we use statics for LazyVals we can get rid
of the "add companion object" logic in FirstTransform
altogether.
|
| | | |
|
| | |
| | |
| | |
| | |
| | | |
The previous version seemed to fail non-deterministically, but after a while
I could not reproduce it anymore. Anyway, leaving the change in.
|
| | |
| | |
| | |
| | |
| | |
| | | |
Dealias TypeRefs that get applied to type arguments. Without that
precaution we get stack overflows in lookupRefined/betaReduce for
CollectionStrawMan5.scala.
|
| | |
| | |
| | |
| | | |
Need to drop the final `$` in both cases.
|
| | |
| | |
| | |
| | |
| | | |
Partial fix of #765. Hack to make sure unexpandedName
works for super accessor names.
|
| | |
| | |
| | |
| | |
| | | |
LambdaTraits are created on demand; we need to make sure
they exist when referred to from Tasty.
|
| | |
| | |
| | |
| | |
| | | |
1) Print RefinedTypes with their hashCode so that we can correlate them with RefinedThis types
2) Fast abort of instantiate in case we have determined that it is not safe anyway
|
| | |
| | |
| | |
| | |
| | |
| | |
| | | |
New CollectionStrawMan5, executed as a run test in two different ways:
- built with scalac, test compiled by dotty in tests/run.
- built with dotty, test compiled by dotty using separate compilation.
|
| | | |
|
| | |
| | |
| | |
| | | |
https://github.com/lampepfl/dotty/pull/1188
|
| | |
| | |
| | |
| | | |
They need to be created through their class tag.
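At the source level this corresponds to the standard `ClassTag`-based construction (a generic sketch, not code from the commit): the tag supplies the runtime element type.

```scala
import scala.reflect.ClassTag

object Sketch {
  // `new Array[T](n)` needs the ClassTag in scope to know the
  // runtime element type of the array being created.
  def makeArray[T: ClassTag](n: Int): Array[T] = new Array[T](n)
  def main(args: Array[String]): Unit = {
    val xs = makeArray[String](3)
    assert(xs.length == 3)
    assert(xs.getClass == classOf[Array[String]])
  }
}
```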
|
| | |
| | |
| | |
| | |
| | |
| | | |
This allowed us to simplify the code in both Applications and tpd.newArray.
Now, only creation of generic arrays is handled by typer.
All other arrays are handled in ArrayConstructors phase.
|
| | |
| | |
| | |
| | |
| | | |
It's needed in order to create calls to newGenericArray
as it needs to infer the ClassTag.
|
| | | |
|
| | |
| | |
| | |
| | |
| | |
| | | |
The problem comes from JavaArrayTypes.
They are invalid before erasure, and cannot be pickled,
while Array[T] is invalid after erasure and should be erased.
|
| | |
| | |
| | |
| | | |
That one knows that only a single magical array method exists.
|
| | |
| | |
| | |
| | | |
It's done in a separate ArrayConstructors phase now.
|
| | |
| | |
| | |
| | |
| | | |
This one can encode the creation of arrays of any type and any dimension.
Note that it does not handle value classes.
|
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | | |
Previously, the method `Arrays.newRefArray` was one of only 3
methods that were kept generic after erasure. This commit removes
this magic by making it take an actual `j.l.Class[T]` as
parameter.
Moreover, the methods `newXArray` all receive an actual body,
implemented on top of Java reflection, which means that a back-end
does not *have to* special-case those methods for correctness.
It might still be required for performance, though, depending on
the back-end.
The JVM back-end is made non-optimal in this commit, precisely
because it does not specialize that method anymore. Doing so
requires modifying the fork of scalac that we use, which should
be done separately.
The JS back-end is adapted simply by doing nothing at all on any
of the newXArray methods. It will normally call the user-space
implementations which use reflection. The Scala.js optimizer will
inline and intrinsify the reflective calls, producing optimal
code at the end of the day.
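The reflective fallback described above boils down to `java.lang.reflect.Array.newInstance` (the actual method names in the runtime support library differ; this only sketches the mechanism):

```scala
object Sketch {
  // Reflective array creation: the Class value carries the element type.
  def newRefArray[T](clazz: Class[T], length: Int): Array[T] =
    java.lang.reflect.Array.newInstance(clazz, length).asInstanceOf[Array[T]]
  def main(args: Array[String]): Unit = {
    val xs = newRefArray(classOf[String], 4)
    assert(xs.length == 4)
    assert(xs.isInstanceOf[Array[String]])
  }
}
```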
|
| |/
|/| |
|
|\ \
| | |
| | | |
Ycheck that all methods have method type
|
| |/ |
|
|\ \
| | |
| | | |
Docs and polishing for denotation insertions
|
| | | |
|
| | | |
|