| Commit message | Author | Age | Files | Lines |
We would like to change from a scheme where eta-expansion was
prototype-driven to one where unapplied parameterized types are always
eta-expanded. The reason is that we might miss some eta-expansions due
to cyclic references.
run/colltest4
is an example. Here, we missed an eta-expansion in the type of Iterable.
The class definition is:
trait Iterable[+A] extends IterableOnce[A] with FromIterable[Iterable] {
We'd expect that the second parent would expand to
FromIterable[[X0] -> Iterable[X0]]
But we miss the expansion because at the time we complete Iterable we have not
completed FromIterable yet. In fact this happens in both the old and the new hk schemes.
But in the old scheme we did not notice the error, whereas in the new scheme we
get an error in PostTyper that the type Iterable does not conform to its bound
`[X0] -> Iterable[X0]`.
With this commit, we change the scheme, so that eta-expansion depends on the
type parameters of a type itself, instead of the expected type.
We should investigate whether we can do a similar change for Scala2 classloading.
Check kinds of type parameters
Also, do not allow a hk type if the bound is a * type.
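The eta-expansion described above can be sketched in current Scala 3 syntax, where the commit's `[X0] -> T` notation is written `[X0] =>> T`. The trait names below mirror the collections prototype mentioned in the message; this is an illustration, not the compiler's own code:

```scala
object EtaExpansionSketch:
  trait IterableOnce[+A]
  trait FromIterable[CC[_]]

  // The unapplied `Iterable` in the second parent has kind * -> *.
  // Under the new scheme it is always eta-expanded to the type lambda
  // [X] =>> Iterable[X], independently of the expected kind:
  trait Iterable[+A] extends IterableOnce[A] with FromIterable[Iterable]

  // The same parent with the eta-expansion written out explicitly:
  trait Iterable2[+A] extends IterableOnce[A]
    with FromIterable[[X] =>> Iterable2[X]]
```

Both parent clauses denote the same type once the first form is eta-expanded.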
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
The new one only reduces straight applications of type lambdas with
definite arguments. It is called very early, in appliedTo and derivedRefinedType.
The old one, now renamed to normalizeHkApply, also handles wildcard arguments
and can garbage-collect generally unneeded hk-refinements. It is called later, at various
places.
TODO: See what functionality of normalizeHkApply should go into betaReduce instead.
Maybe we can even drop normalizeHkApply? However, we need to be careful to maintain
aliases for hk type inference.
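At the source level, the kind of reduction betaReduce performs can be illustrated as follows (a sketch only; the compiler applies it to internal type representations, not to source):

```scala
object BetaReduceSketch:
  // A type lambda...
  type Lam = [X] =>> List[X]

  // ...applied to a definite argument beta-reduces:
  // Lam[Int] and List[Int] are the same type.
  val xs: Lam[Int] = List(1, 2, 3)
  val ys: List[Int] = xs
```

A wildcard argument such as `Lam[?]` is the kind of application the old normalizeHkApply still has to handle.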
|
|
|
|
|
| |
With this change, ski compiles (but with more errors than before).
Without it, it goes into various infinite recursions.
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
Map self-references in refinements to recursive types. This
commit does this for refinement types appearing in source.
We still have to do it for unpickled refinements.
Test apply-equiv got moved to pending because it simulates
the old higher-kinded type encoding in source, which relies
on the old representation in terms of self-referential refinement
types. The plan is not to adapt this encoding to the new
representation, but to replace it with a different encoding
that makes critical use of the added power of recursive types.
Use recursive types also when unpickling from Scala 2.x.
Add mapInfo method to Denotations.
|
| |
|
|\
| |
| | |
Fix #856: Handle try/catch cases as catch cases if possible.
|
| |
| |
| |
| |
| |
| |
| | |
Previously they were all lifted into a match with the same cases.
Now the first cases are handled directly by the catch. If one
of the cases cannot be handled, the old scheme is applied to it
and all subsequent cases.
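A sketch of the source shape this targets (the method and names are hypothetical): a plain typed case can be compiled directly as a JVM exception handler, while a guarded case is where the lifted match takes over:

```scala
object CatchSketch:
  def parse(s: String): Int =
    try s.toInt
    catch
      // a plain typed case: can be handled directly by the catch
      case _: NumberFormatException => 0
      // a guarded case: from here on, the old lift-into-a-match
      // scheme applies to this case and all subsequent ones
      case e: Throwable if e.getMessage == null => -1
```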
|
|/
|
|
|
|
|
| |
partest adds a warning in a comment at the beginning of source files
that it copies, but this means that every line number displayed in a
stack trace is offset by 6. We can work around this by making the warning
a single line with no newline at the end.
|
|\
| |
| | |
Multiple fixes to @static
|
| | |
|
| | |
|
| | |
|
|\ \
| | |
| | | |
Fix issue with GADTs not typechecking without a bind in a match
|
| | | |
|
| |/
|/| |
|
|\ \
| |/
|/| |
Properly report errors when CLI flags are malformed
|
| |
| |
| |
| |
| | |
Previously we returned an empty Reporter with no errors, so partest
reported the test as a success.
|
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| | |
If directly applicable alternatives exist, do not try other
alternatives. The original motivation for this change was to reduce the
number of searches for implicit views we do, since some overloaded
methods like `Int#+` are used a lot, but it turns out that this also
makes more code compile (see `overload_directly_applicable.scala` for an
example). This change does not seem to match what the specification
says (it does not define a notion of "directly applicable"), but it does
match the behavior of scalac, and it seems useful in general.
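A sketch of the effect, with hypothetical names: when one alternative is applicable without an implicit view, the alternative that would need a view is not tried at all:

```scala
object DirectApplicable:
  class A
  class B
  // an implicit view from A to B exists...
  given Conversion[A, B] = (a: A) => B()

  def g(a: A): String = "direct"
  def g(b: B): String = "via view"

  // ...but A() is directly applicable to g(A), so no search for
  // the view A => B is performed and g(A) is selected.
  val chosen = g(A())
```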
|
| |
| |
| |
| |
| | |
This did not work before because we incorrectly looked for their value
in the prefix of the type instead of the type itself.
|
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| | |
The issue is subtle: the `tp` in scope in `def ofTypeImplicits` is the
`tp` passed to the top-level `implicitScope` method, not the `tp` passed
to the recursively called `iscope`. This means that before this commit,
all intermediate `OfTypeImplicit` scopes cached while computing an
implicit scope had their `tp` field incorrectly set, which means that we
could miss implicits in later implicit searches.
Note that the `implicit_cache.scala` test worked before this commit
because of the restrictions on caching that exist since
b8b0f381ef2cbcb7bad66fd3e7a9ae929baa45f6; it is included anyway because
our caching strategy might change in the future.
|
|/ |
|
|\
| |
| | |
Fixes to lambdalift that prevent memory leaks.
|
| |
| |
| |
| | |
See t5375.scala for details.
|
| | |
|
| | |
|
| |
| |
| |
| |
| |
| |
| | |
One drawback of this approach is that the type seems to propagate.
I.e. if the return type of an expression is `repeated`, then the
enclosing variable will get the `repeated` type instead of getting the
expected `Seq` type.
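An illustrative sketch of the situation described (names hypothetical): inside a varargs method the parameter has the repeated type, and the drawback is that a binding of it may receive that repeated type rather than the expected `Seq`:

```scala
object RepeatedSketch:
  def capture(xs: Int*): Seq[Int] =
    // `xs` has the repeated type Int*; ideally the binding below
    // should get the expected type Seq[Int], not the repeated type
    val captured = xs
    captured
```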
|
| | |
|
| |
| |
| |
| |
| |
| |
| | |
The tests `i1059.scala` and `t3480.scala` are failing due to a bug
in the pattern matcher that evaluates the `x` in `List(x: _*)` incorrectly.
Related issue: #1276
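For reference, the pattern shape in question, written with the modern `xs*` vararg-pattern syntax (the commit's tests use the older `x: _*` spelling):

```scala
object VarargPattern:
  def all(xs: List[Int]): Seq[Int] = xs match
    // `elems*` binds the whole sequence of matched elements;
    // this is the binding the pattern matcher evaluated incorrectly
    case List(elems*) => elems
```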
|
| |
| |
| |
| |
| |
| |
| |
| |
| | |
To make tests pass, this required a looser specification of
`assumedCanEquals`, so that an abstract type T can be compared to
arbitrary values, as long as its upper bound can be compared. E.g.
T == null
T == "abc"
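A sketch of the comparisons this permits (the helper is hypothetical; the point of the looser `assumedCanEquals` is that such code passes the equality checks because the upper bound `String` is comparable):

```scala
object BoundCompare:
  // T is abstract, but its upper bound String can be compared,
  // so comparisons of T against null and string values are allowed:
  def cmp[T <: String](x: T): Boolean =
    x == null || x == "abc"
```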
|
| | |
|
| |
| |
| |
| | |
(and add it to commit set).
|
| | |
|
| |
| |
| |
| |
| |
| |
| |
| |
| |
| | |
Also, check that pattern matching against idents/selects/literals makes
sense.
The hooks perform an implicit search for an instance of `Eq[L, R]`, where
`L` and `R` are the argument types. So far this always succeeds because `Eq.eqAny`
matches all such types. A separate commit will check the returned
search term for validity.
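A minimal model of the described hook; the `Eq` trait and `eqAny` below are stand-ins mirroring the commit's description, not the library's final API:

```scala
object EqSketch:
  trait Eq[-L, -R]
  object Eq:
    // analogue of Eq.eqAny: a catch-all instance, so for now the
    // implicit search for Eq[L, R] always succeeds
    given eqAny[L, R]: Eq[L, R] = new Eq[L, R] {}

  // stand-in for the comparison hook: requires evidence that L and R
  // may be compared before performing the comparison
  def eqTest[L, R](l: L, r: R)(using Eq[L, R]): Boolean = l == r
```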
|
| | |
|
| |
| |
| |
| |
| |
| |
| |
| |
| | |
Compare selected contravariant arguments as if they were covariant.
Which ones is explained in the doc comment for method `isAsSpecificValueType`
in Applications.scala.
This has the same motivation as what @paulp proposed around 2012. The solution is a bit
different from the one proposed then because it only affects top-level parameters.
|
| |
| |
| |
| | |
Unrelated to other commits but useful to get in.
|
| |
| |
| |
| | |
Real test is in neg/customargs
|
| |
| |
| |
| |
| | |
Used to throw an uncaught merge error in checkAllOverrides
when compiling i1240c.scala.
|
| | |
|
| |
| |
| |
| |
| | |
This showcases a tricky interaction between overloading and overriding.
See discussion of #1240 for context.
|
| |
| |
| |
| |
| |
| |
| |
| | |
When finding two symbols in the same class that have the same signature
as seen from some prefix, issue a merge error.
This is simpler and more robust than the alternative of producing an overloaded
denotation and dealing with it afterwards.
|
| | |
|
|/ |
|
|
|
|
|
| |
Since we decided to go with the non-dotty-scanner approach, these changes are
unnecessary; we might as well revert them.
|
| |
|
|
|
|
|
| |
No longer needed because we are going to allow dependent method types
in extractors, and the unary requirement is kind of obvious.
|
|
|
|
|
| |
Now explains in detail why a possibly found unapply or
unapplySeq is ineligible.
|
| |
|
|\ |
|
| |\
| | |
| | | |
Allow specifying a per-call-site @tailrec annotation.
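For context, the usual method-level form of the annotation, which asks the compiler to verify that all recursive calls are in tail position; per the commit, the same request can now also be made at an individual call site (only the established method-level form is sketched here):

```scala
import scala.annotation.tailrec

object TailrecSketch:
  // @tailrec on the method: compilation fails unless every
  // recursive call can be optimized into a loop
  @tailrec
  def sum(n: Int, acc: Int = 0): Int =
    if n == 0 then acc else sum(n - 1, acc + n)
```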
|