| Commit message | Author | Age | Files | Lines |
As a side effect, avoid creating synthetic parameters in lambda abstract.
|
|
|
|
|
The typed variant is no longer needed. This means modifiers can safely be
ignored in typed trees if we so choose.
Roll its functionality into Select. Since we can always
tell whether a tree is a type or term there is no expressiveness
gained by having a separate tree node.
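
A minimal sketch of the idea (toy classes, not dotc's actual Tree hierarchy): once names carry their kind, a single Select node can serve both term and type selections, so no separate SelectTypeTree-style node is needed.

```scala
// Toy model of a unified Select node. Whether a Select is a type tree
// follows from the kind of the selected name, so the information is
// always recoverable without a dedicated tree node class.
sealed trait Tree
final case class Name(value: String, isTypeName: Boolean)
final case class Ident(name: Name) extends Tree
final case class Select(qual: Tree, name: Name) extends Tree:
  def isType: Boolean = name.isTypeName
```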
Drop tree node class 'Pair'. It was used only in imports, where
it can easily be replaced by Thicket.
The envisaged use for generic pairs is almost surely better modelled
by a "Pair" class in Dotty's standard library.
Now it's annotated first, annotation second.
This is in line with AnnotatedType and in line with the principle
that tree arguments should come in the order they are written. The
reason why the order was swapped before is historical - Scala2 did it
that way.
- Roll `sm` and `i` into one interpolator (also called `i`).
- Evolve `d` into the `em` interpolator (for error messages).
- Add a new interpolator `ex` with more explanations; it replaces disambiguation.
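
The compiler's interpolators are internal to dotc, but the idea of rolling `sm` (margin stripping) into `i` can be sketched with an ordinary custom interpolator (illustrative only, not the compiler's implementation):

```scala
// One interpolator that both substitutes arguments (like `s`) and strips
// margins (what a separate `sm` interpolator used to do).
extension (sc: StringContext)
  def i(args: Any*): String = sc.s(args*).stripMargin
```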
Make treatment in Scala2Unpickler and Namer the same and factor
out common functionality.
Faced with recursive dependencies through self types, we might have
to apply `normalizeToClassRefs` to a class P with a parent that is not
yet initialized (witnessed by P's parents being Nil). In that case
we should still execute forwardRefs on P, but we have to
wait in a suspension until P is initialized.
This avoids the problem raised in #1401. I am still not quite sure
why forwardRefs is needed, but it seems that asSeenFrom alone is not
enough to track the dependencies in this case.
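
The suspension idea in the abstract, with invented names (this is not the compiler's actual mechanism): actions against a not-yet-initialized object are queued and run once the object is marked ready.

```scala
import scala.collection.mutable

// Sketch: defer actions until initialization completes, then flush them.
final class Suspension:
  private var ready = false
  private val queued = mutable.ListBuffer[() => Unit]()
  def onReady(action: () => Unit): Unit =
    if ready then action() else queued += action
  def markReady(): Unit =
    ready = true
    val actions = queued.toList
    queued.clear()
    actions.foreach(_())
```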
Also allows us to drop the knownHK method involved. Lots of other
cleanups.
Under the new hk scheme we discovered that type parameters
are sometimes unpickled in the wrong order. The fault was always
present but the previous hk schemes were somehow lenient
enough in their subtyping rules to not discover the problem.
E.g., when reading Coder.scala, dotc believed that parameter
`A` of `TraversableOnce#BufferedCanBuildFrom` is higher-kinded
and parameter `CC` is first-order where the opposite is true.
This commit hardens the way we read type parameters in order
to make this swap impossible by design.
- Revert auto-healing in derivedAppliedType
The healing hid a real error about order of type parameters
in Scala2 unpickling which was fixed in the previous commits.
The healing caused Map.scala to fail because it is possible
that type parameters are mis-predicted to be Nil in an F-bounded
context.
- Smallish fixes to type applications
With the change in the next commit this addition is needed
to make i859.scala compile. Previously the same effect was
achieved accidentally by `updateTypeParams`. The comment
admits that we do not really know why the functionality is needed.
Make them each inherit from the common base type GenericType.
That way we avoid accidentally inheriting functionality from PolyType in TypeLambda.
Also, fix adaptation of type lambdas: don't confuse them with PolyTypes.
Add existential type elimination for HKApply
- Simplify RefinedType
- Drop recursive definition of RefinedThis - this is now
taken over by RecType.
- Drop RefinedThis.
- Simplify typeParams
The logic avoiding forcing is no longer needed.
- Remove unused code and out of date comments.
For the moment under newHK flag.
- avoid crasher in derivedTypeParams (NamedTypes don't always have symbols)
- Revise logic in type comparer for new HK scheme
Map self-references in refinements to recursive types. This
commit does this for refinement types appearing in source.
We still have to do it for unpickled refinements.
Test apply-equiv got moved to pending because it simulates
the old higher-kinded type encoding in source, which relies
on the old representation in terms of self-referential refinement
types. The plan is not to adapt this encoding to the new
representation, but to replace it with a different encoding
that makes critical use of the added power of recursive types.
Use recursive types also when unpickling from Scala 2.x.
Add mapInfo method to Denotations.
Treat parent like refinedInfo. Introduce isBinding convenience method
in TypeBounds.
We had a problem where unpickling an annotation containing a
class constant had the wrong type. Unpickling was done after erasure.
The type given to the constant was an alias but aliases got
eliminated during erasure, so the constant was malformed.
Unpickling annotation contents at the same phase as unpickling
the annotation carrier solves the problem.
It seems similar problems can arise when data is unpickled
using a LocalUnpickler. So we now make sure local unpickling
runs at the latest at phase Pickler.
First of a series of compiler design documents
Mode is used from a lot of low-level code and does not just reflect Typer info,
so it makes more sense to place it in the core package.
builds
The field keeps track of the element type. This is necessary
because JavaSeqLiteral is nonvariant and the elements might
be empty, so we cannot always compute the type from the
element types.
Hide stack traces behind -Ydebug
They're not very useful for end users, and some tests like
tests/neg/selfreq.scala always print these exceptions, which makes it
harder to read the test logs.
Also use Thread.dumpStack() instead of creating an Exception and calling
printStackTrace() on it.
It seems that, when unpickling nsc output, some module classes come
without a source module. Survive this situation rather than
crashing; i859.scala is an example.
i859 compiles with the patch, but causes a deep subtype when unpickling.
Not sure whether scalac does the same.
Previously adaptIfHK was performed on every type application. This made
t3152 fail. We now do this only on demand, in isSubType. t3152 now passes
again. But the change unmasked another error, which makes Iter2 fail to compile.
Also: fix adaptArgs and LambdaTrait to make it work.
The fact that the annotation comes first is weird, because when I write
an annotated type it's <type> @<annotation>. Also, annotated types
are like RefinedTypes in that they derive from a parent type. And in
RefinedTypes the parent comes first.
So swapping the arguments improves consistency.
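
For reference, the source-level order the commit appeals to, using a made-up annotation `doc` (any annotation class would do): the annotated type comes first, the annotation second.

```scala
import scala.annotation.StaticAnnotation

// `doc` is an illustrative annotation, not part of the compiler or library.
class doc(text: String) extends StaticAnnotation

// In source syntax the order is <type> @<annotation>:
val releases: List[Int @doc("release number")] = List(1, 2, 3)
```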
Adding parents signals (via SymDenotation.fullyDefined) that
the class can now be frozen. So this should be done only after all
members are entered.
When eta expanding a type `C` to `[vX] => C[X]` the variance `v`
is now the variance of the expected type, not the variance of the
type constructor `C`, as long as the variance of the expected type
is compatible with the variance of `C`.
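
A hand-written version of the eta-expansion in current Scala 3 syntax (illustrative; `Box` and the alias names are made up):

```scala
// A covariant type constructor standing in for `C`.
class Box[+A](val value: A)

// Eta-expansion of Box written out as a type lambda.
type BoxF = [X] =>> Box[X]     // invariant parameter
type BoxFCo = [+X] =>> Box[X]  // covariant parameter, compatible with Box's variance

val b: BoxF[Int] = Box(1)
```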
1) Check that searched scope is consistent
2) Do a linear search for symbol with name, and report
if something was found that way.
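
A hypothetical sketch of the described check, with invented names (`Sym`, `lookupChecked`): if a hashed lookup misses but a linear scan of the scope finds the name, the scope is inconsistent and we report it.

```scala
final case class Sym(name: String)

// Fall back to a linear search when the indexed lookup misses; a hit on the
// linear path alone indicates an inconsistent scope.
def lookupChecked(scope: List[Sym], index: Map[String, Sym], name: String): Option[Sym] =
  index.get(name).orElse {
    val hit = scope.find(_.name == name)
    if hit.isDefined then
      Console.err.println(s"inconsistent scope: $name found only by linear search")
    hit
  }
```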
Was stdout, but this gets mixed up with the exception printing on stderr.
Set info early in order to avoid cyclic reference errors.
Errors were observed when compiling
scala/Predef.scala scala/package.scala scala/collection/GenSeqLike.scala
Since we now have two forms of (almost) everything in Definitions,
might as well profit from it.
Trying to hunt down the flaky build.
When compiling Iterator.scala it was observed that
the type parameters of BufferedCanBuildFrom appeared
in the wrong order. This fix corrects that, making
sure that type parameters appear in the decls scope
in the same order as they are given in the explicitly
unpickled type parameter list.
Change allow ex in hk
Reason: an inner Scala 2 class might be shadowed by a same-named class in a subtype.
In Dotty this is disallowed, but in Scala 2 it is possible. For instance, math.Numeric
and math.Ordering both have an inner class "Ops". Normal TypeRef types could not
select the shadowed class in Ordering if the prefix is of type Numeric.
Dotty modules do.