| Commit message | Author | Age | Files | Lines |
| |
|
|
|
|
|
| |
Replaces SeqLiterals with JavaSeqLiterals, because the latter's
(array) type is preserved after erasure.
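The distinction can be seen in ordinary Scala, outside the compiler: a JVM array carries its element type through erasure, while a Seq's element type is erased. A minimal sketch:

```scala
// Ordinary Scala, not compiler internals: a sketch of why an array's
// element type survives erasure while a Seq's element type does not.
val arr: Array[String] = Array("a", "b")
val seq: Seq[String]   = Seq("a", "b")

// The JVM array still carries its component type at runtime...
val component = arr.getClass.getComponentType  // class java.lang.String
// ...whereas the Seq is just some Seq implementation; String is erased.
val seqClass = seq.getClass
```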
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
When hash-consing TypeBounds, RefinedTypes and NamedTypes, we
now check the argument types with `eq`, where before it was `==`.
This is necessary for TypeBounds, because it makes a difference
whether a TypeBounds has `eq` parts (then it is a type alias) or not.
So we cannot merge type aliases with non-aliases.
The symptom of the problem appeared when compiling Patterns.scala twice with
the new SeqLiterals phase (next commit) enabled. On the second run, we encountered
an ArrayType[>: String <: String], even though we only created an ArrayType[String].
This was a consequence of the two types being identified by uniques.
Todo: Change the system so that type aliases are recognized more robustly.
But the present change seems to be useful anyway because it speeds up
uniques hashing. I verified that the stricter condition on uniques creates fewer
than 1% more types than before, so the speedup of hashing looks worthwhile.
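The pitfall can be sketched with simplified stand-in types (hypothetical names, not the actual compiler classes): two bounds can be `==` (structurally equal parts) without being `eq` (identical objects), and only the `eq` case denotes an alias.

```scala
// Hypothetical sketch of why hash-consing must compare parts with
// `eq` rather than `==`: a bounds pair whose lower and upper parts
// are the *same object* is an alias, but its parts can still be
// `==` to those of a non-alias.
case class Tpe(name: String)

case class Bounds(lo: Tpe, hi: Tpe) {
  def isAlias: Boolean = lo eq hi
}

val s1 = Tpe("String")
val s2 = Tpe("String")         // `==` to s1, but not `eq`

val alias    = Bounds(s1, s1)  // alias: lo eq hi
val nonAlias = Bounds(s1, s2)  // not an alias, yet `==` to `alias`

// A uniques table keyed by `==` would identify the two entries and
// lose the alias; keying by `eq` on the parts keeps them distinct.
```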
|
| |
|
|
|
|
|
|
|
|
|
| |
Avoid creating ArrayTypes after erasure.
Note that the *extractor* does not recognize JavaArrayTypes as
ArrayTypes. Doing so would create an infinite loop in sigName.
Generally, we do not want to paper over the difference when analysing
types.
|
|\
| |
| | |
Fix TreeTransform ignoring SeqLiterals.
|
|/ |
|
|\
| |
| | |
Fix array creation v2
|
| |
| |
| |
| |
| |
| | |
Now: All new Array[T] methods are translated to calls of the form
dotty.Arrays.newXYZArray ...
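A hypothetical sketch of what factories in the style of `dotty.Arrays.newXYZArray` might look like; the method names and shapes here are assumptions for illustration, not the actual runtime entry points.

```scala
// Sketch: one factory per element kind, so that `new Array[T](n)`
// can be lowered to a plain method call after erasure.
object Arrays {
  def newIntArray(length: Int): Array[Int] = new Array[Int](length)
  def newDoubleArray(length: Int): Array[Double] = new Array[Double](length)
  def newRefArray(length: Int): Array[AnyRef] = new Array[AnyRef](length)
}

val ints = Arrays.newIntArray(3)  // what `new Array[Int](3)` would lower to
```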
|
|\ \
| |/
|/| |
Fix/mixins
|
| |
| |
| |
| | |
The clause got accidentally dropped in the rebase.
|
| | |
|
| |
| |
| |
| |
| |
| |
| |
| |
| |
| | |
The data race happened because of the method `transform` implemented
by ResolveSuper, which disambiguated overridden methods.
Previously, there was a reference FirstTransform.this.transform
of type TermRefWithSig to the method implemented in a super trait. Now the same
reference points to the newly implemented method.
Solved because ResolveSuper now generates symbolic references.
|
| |
| |
| |
| | |
The bug caused the new version of FirstTransform to compile with errors.
|
|\ \
| |/
|/| |
Allow refinements that refine already refined types.
|
|/
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
Previously, a double definition error for `T` was produced in a case like this:
type T1 = C { type T <: A }
type T2 = T1 { type T <: B }
This was caused by the way T1 was treated in the refinement class
that is used to typecheck the type. Desugaring of T2 with `refinedTypeToClass`
would give
trait <refinement> extends T1 { type T <: B }
and `normalizeToClassRefs` would transform this to:
trait <refinement> extends C { type T <: A; type T <: B }
Hence the double definition. The new scheme desugars the rhs of `T2` to:
trait <refinement> extends C { this: T1 => type T <: B }
which avoids the problem.
Also, added tests that #232 (fix/boundsPropagation) indeed considers all refinements
together when comparing refined types.
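The shapes involved can be written in ordinary Scala 3 (A, B, C here are placeholder classes): T2 refines the member T of the already-refined type T1, and under the new desugaring this no longer triggers a double-definition error.

```scala
// Placeholder classes standing in for the A, B, C of the example.
class A
class B extends A
trait C { type T }

type T1 = C { type T <: A }
type T2 = T1 { type T <: B }

// A value conforming to T2: its member T satisfies both bounds.
val ok: T2 = new C { type T = B }
```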
|
|\
| |
| | |
Fix/bounds propagation v2
|
| | |
|
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| | |
Turns out that the last commit was a red herring. None of the hoops
it jumped through were necessary. Instead there was a bug in isRef
which caused `&` to erroneously compute T & Int as Int.
The bug was that we always approximated alias types by their high bound. But
in the present case, this leads to errors because U gets bounds `>: Nothing <: Any`,
but it was still an alias type (i.e. its Deferred flag is not set). The fix
dereferences aliases only if their info is a TypeAlias.
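In ordinary Scala 3 terms, the intersection in question must retain both of its parts; collapsing `T & Int` to `Int`, as the buggy `&` did, would lose the `T` side. A minimal sketch:

```scala
// `T & Int` is usable both at type T and at type Int; neither part
// may be dropped when the intersection is computed.
def both[T](x: T & Int): (T, Int) = (x, x)

val r = both[Int](5)  // here T = Int, so T & Int is just Int
```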
|
| |
| |
| |
| | |
Now detects the cycles reported by @retronym
|
| |
| |
| |
| | |
More robust cyclicity check which does not depend on source positions.
|
| |
| |
| |
| | |
to reflect last commit.
|
| |
| |
| |
| | |
Tougher checks, but only deprecation warnings instead of errors.
|
| |
| |
| |
| |
| | |
We used to approximate these by their bounds, but this is confusing.
See comment in printbounds.scala.
|
| |
| |
| |
| |
| |
| | |
Move core logic to TypeOps, only leave error reporting in Checking.
That way, we have the option of re-using the code as a simple test
elsewhere.
|
| | |
|
| |
| |
| |
| |
| | |
Need to account for the fact that some argument types may be TypeBounds themselves.
The change makes Jason's latest example work.
|
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| | |
The previous scheme did not propagate bounds correctly. More generally,
given a comparison
T { X <: A } <: U { X <: B }
it would erroneously decompose this to
T <: U, A <: B
But we really need to check whether the total constraint for X in T { X <: A }
subsumes the total constraint for X in U { X <: B }.
The new scheme propagates only if the binding in the lower type is an alias.
E.g.
T { X = A } <: U { X <: B }
decomposes to
T { X = A } <: U, A <: B
The change uncovered another bug, where in the slow path we took a member relative to a refined type;
we need to "narrow" the type to a RefinedThis instead. (See use of "narrow" in TypeComparer.)
That change uncovered a third bug concerning the underlying type of a RefinedThis. The last bug was fixed in a previous commit (84f32cd814f2e07725b6ad1f6bff23d4ee38c397).
Two tests (1048, 1843) which were pos tests for scalac but failed compiling in dotc have
changed their status and location. They typecheck now, but fail later. They have been
moved to pending.
There's a lot of diagnostic code in TypeComparer to figure out the various problems. I left it in
to be able to come back to the commit in case there are more problems. The checks and diagnostics
will be removed in a subsequent commit.
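The propagation rule can be sketched in ordinary Scala 3 (T, U, A, B, X are placeholders): the lower type binds X by an *alias*, so the decomposition is sound.

```scala
// Placeholders chosen so that A <: B, as the decomposition requires.
class B
class A extends B
trait U { type X }
trait T extends U

// T { X = A } <: U { X <: B } holds: T <: U, and the alias A
// satisfies the bound X <: B.
val ev = summon[(T { type X = A }) <:< (U { type X <: B })]
```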
|
| |
| |
| |
| |
| |
| |
| | |
Now: The underlying refined type. Was: The parent of the type.
We need the change because RefinedThis is used as a narrowed version
of the underlying refinedType (e.g. in TypeComparer rebase), and the old
scheme would lose a binding of that type.
|
| |
| |
| |
| | |
Avoids cyclic references caused by forcing info too early.
|
| |
| |
| |
| |
| | |
Avoid the crash if origin is not associated with a bound in the
current constraint.
|
| |
| |
| |
| |
| |
| |
| | |
We need to adapt type parameter bounds with an as-seen-from to the
prefix of the type constructor.
Makes pos/boundspropagation pass.
|
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| | |
Needed some fixes to lookupRefined. The potential alias
type is now calculated by taking the member of the original
refined type, instead of by simply following the refined info.
This takes into account refinements that were defined after
the refinement type that contains the alias.
The change made another test (transform) hit the deep subtype limit,
which is now disabled.
|
|/
|
|
|
|
|
|
|
|
| |
An example where this helps:
Previously, the private value `mnemonics` in Coder.scala was of the form
Lambda$IP { ... } # Apply
It now simplifies to a Map[...] type.
|
|\
| |
| | |
Javaparser & ElimRepeated fixes & Annotation-fixes
|
| |
| |
| |
| | |
Packages also get a JavaDefined flag, but they shouldn't be removed by FirstTransform.
|
| | |
|
| | |
|
| | |
|
| | |
|
| | |
|
| | |
|
| |
| |
| |
| | |
see annot.scala for examples
|
| | |
|
| |
| |
| |
| |
| |
| |
| |
| |
| |
| | |
Annotations in Java could be compiled as if an array-only
annotation had a constructor with <repeated> arguments.
That isn't true for Scala.
Also, type-checking the creation of a single-element array
requires implicit resolution to provide a ClassTag.
This causes problems while reading deferred annotations.
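The ClassTag requirement mentioned above can be seen in plain Scala: building an array with a generic element type needs an implicit ClassTag, which is what complicates type-checking a single-element annotation argument array.

```scala
import scala.reflect.ClassTag

// Creating an array of a generic element type forces implicit
// resolution of a ClassTag for that type.
def singleton[T: ClassTag](x: T): Array[T] = Array(x)

val xs = singleton("value")  // ClassTag[String] resolved implicitly
```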
|
| | |
|
| |
| |
| |
| |
| |
| |
| | |
Required because gettersAndSetters ignores the modifiers
in the tree and uses the ones in the type instead.
This means that gettersAndSetters carries modifiers
over from the type to the tree, and this violates postconditions.
|
| |
| |
| |
| | |
to be reused by FirstTransform
|
| |
| |
| |
| |
| |
| |
| |
| | |
transformSym explicitly checks that a field is JavaDefined and does not create a symbol for it.
Creation of a setter body looks for the symbol and fails because it does not find it.
We do not need setter bodies for Java fields because we are not generating bytecode for them.
|
| |
| |
| |
| |
| |
| | |
The dummy constructor is needed so that the real constructors see the import of the companion object.
The constructor has a parameter of type Unit so that no Java code can call it.
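A sketch of the pattern described above (the class name is a placeholder): the primary constructor takes a Unit parameter, a type Java source cannot readily supply, so only the real Scala constructors are callable from Java.

```scala
// Dummy-constructor pattern: the Unit-taking constructor exists only
// so the real constructors can delegate to it; Java code cannot
// readily call it.
class Widget(dummy: Unit) {
  def this(name: String) = this(())
}

val w = new Widget("w")  // the real constructor delegates to the dummy
```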
|