57 files changed, 915 insertions, 221 deletions
@@ -38,7 +38,7 @@ unset verbose quiet cygwin toolcp colors saved_stty CDPATH CompilerMain=dotty.tools.dotc.Main FromTasty=dotty.tools.dotc.FromTasty -ReplMain=test.DottyRepl +ReplMain=dotty.tools.dotc.repl.Main diff --git a/docs/dotc-internals/overall-structure.md b/docs/dotc-internals/overall-structure.md new file mode 100644 index 000000000..a80c35b4c --- /dev/null +++ b/docs/dotc-internals/overall-structure.md @@ -0,0 +1,174 @@ +# Dotc's Overall Structure + +The compiler code is found in package [dotty.tools](https://github.com/lampepfl/dotty/tree/master/src/dotty/tools). It spans the +following three sub-packages: + + backend Compiler backends (currently for JVM and JS) + dotc The main compiler + io Helper modules for file access and classpath handling. + +The [dotc](https://github.com/lampepfl/dotty/tree/master/src/dotty/tools/dotc) +package contains some main classes that can be run as separate +programs. The most important one is class +[Main](https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/Main.scala). +`Main` inherits from +[Driver](https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/Driver.scala) which +contains the highest level functions for starting a compiler and processing some sources. +`Driver` in turn is based on two other high-level classes, +[Compiler](https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/Compiler.scala) and +[Run](https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/Run.scala). + +## Package Structure + +Most functionality of `dotc` is implemented in subpackages of `dotc`. Here's a list of sub-packages +and their focus. + + ast Abstract syntax trees, + config Compiler configuration, settings, platform specific definitions. 
+ core                Core data structures and operations, with specific subpackages for:
+
+   core.classfile      Reading of Java classfiles into core data structures
+   core.tasty          Reading and writing of TASTY files to/from core data structures
+   core.unpickleScala2 Reading of Scala 2 symbol information into core data structures
+
+ parsing             Scanner and parser
+ printing            Pretty-printing trees, types and other data
+ repl                The interactive REPL
+ reporting           Reporting of error messages, warnings and other info.
+ rewrite             Helpers for rewriting Scala 2's constructs into dotty's.
+ transform           Miniphases and helpers for tree transformations.
+ typer               Type-checking and other frontend phases
+ util                General purpose utility classes and modules.
+
+## Contexts
+
+`dotc` has almost no global state (the only significant bit of global state is the name table,
+which is used to hash strings into unique names). Instead, all essential bits of information that
+can vary over a compiler run are collected in a
+[Context](https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/core/Contexts.scala).
+Most methods in `dotc` take a `Context` value as an implicit parameter.
+
+Contexts give a convenient way to customize values in some part of the
+call-graph. To run, e.g., some compiler function `f` at a given
+phase `phase`, we invoke `f` with an explicit context parameter, like
+this:
+
+    f(/*normal args*/)(ctx.withPhase(phase))
+
+This assumes that `f` is defined in the way most compiler functions are:
+
+    def f(/*normal parameters*/)(implicit ctx: Context) ...
+
+Compiler code follows the convention that all implicit `Context`
+parameters are named `ctx`. This is important to avoid implicit
+ambiguities in the case where nested methods each take a `Context`
+parameter. The common name then ensures that the inner implicit
+parameter properly shadows the outer one.
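As an illustration of these conventions, here is a minimal, self-contained sketch. The `Context` below is a hypothetical stand-in with only a `phase` field, not the real `dotty.tools.dotc.core.Contexts.Context`; it only demonstrates how a uniformly named implicit `ctx` shadows the enclosing one, and how `withPhase` lets a caller override the context explicitly:

```scala
// Sketch only: a toy Context, not the compiler's actual class.
object ContextSketch {
  case class Context(phase: String) {
    def withPhase(p: String): Context = copy(phase = p)
  }

  def outer(implicit ctx: Context): String = {
    // The nested method's own implicit `ctx` shadows the outer one,
    // so `ctx.phase` inside `inner` always refers to inner's parameter.
    def inner(implicit ctx: Context): String = s"running at ${ctx.phase}"
    // Run `inner` at an explicitly chosen phase, as in f(...)(ctx.withPhase(phase)):
    inner(ctx.withPhase("erasure"))
  }

  def main(args: Array[String]): Unit =
    println(outer(Context("typer"))) // prints "running at erasure"
}
```

Because both implicit parameters share the name `ctx`, there is never an ambiguity: the innermost one wins, which is exactly the shadowing behavior the convention relies on.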
+
+Sometimes we want to make sure that implicit contexts are not captured
+in closures or other long-lived objects, be it because we want to
+enforce that nested methods each get their own implicit context, or
+because we want to avoid a space leak in the case where a closure can
+survive several compiler runs. A typical case is a completer for a
+symbol representing an external class, which produces the attributes
+of the symbol on demand, and which might never be invoked. In that
+case we follow the convention that any context parameter is explicit,
+not implicit, so we can track where it is used, and that it has a name
+different from `ctx`. A commonly used name is `ictx`, for
+"initialization context".
+
+With these two conventions in place, implicit contexts have turned out
+to work amazingly well as a device for dependency injection and bulk
+parameterization. There is, of course, always the danger that an
+unexpected implicit will be passed, but in practice this has not turned
+out to be much of a problem.
+
+## Compiler Phases
+
+Seen from a temporal perspective, the `dotc` compiler consists of a list of phases.
+The current list of phases is specified in class [Compiler](https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/Compiler.scala) as follows: + +```scala + def phases: List[List[Phase]] = List( + List(new FrontEnd), // Compiler frontend: scanner, parser, namer, typer + List(new PostTyper), // Additional checks and cleanups after type checking + List(new Pickler), // Generate TASTY info + List(new FirstTransform, // Some transformations to put trees into a canonical form + new CheckReentrant), // Internal use only: Check that compiled program has no data races involving global vars + List(new RefChecks, // Various checks mostly related to abstract members and overriding + new CheckStatic, // Check restrictions that apply to @static members + new ElimRepeated, // Rewrite vararg parameters and arguments + new NormalizeFlags, // Rewrite some definition flags + new ExtensionMethods, // Expand methods of value classes with extension methods + new ExpandSAMs, // Expand single abstract method closures to anonymous classes + new TailRec, // Rewrite tail recursion to loops + new LiftTry, // Put try expressions that might execute on non-empty stacks into their own methods + new ClassOf), // Expand `Predef.classOf` calls. + List(new PatternMatcher, // Compile pattern matches + new ExplicitOuter, // Add accessors to outer classes from nested ones. + new ExplicitSelf, // Make references to non-trivial self types explicit as casts + new CrossCastAnd, // Normalize selections involving intersection types. 
+ new Splitter), // Expand selections involving union types into conditionals + List(new VCInlineMethods, // Inlines calls to value class methods + new SeqLiterals, // Express vararg arguments as arrays + new InterceptedMethods, // Special handling of `==`, `|=`, `getClass` methods + new Getters, // Replace non-private vals and vars with getter defs (fields are added later) + new ElimByName, // Expand by-name parameters and arguments + new AugmentScala2Traits, // Expand traits defined in Scala 2.11 to simulate old-style rewritings + new ResolveSuper), // Implement super accessors and add forwarders to trait methods + List(new Erasure), // Rewrite types to JVM model, erasing all type parameters, abstract types and refinements. + List(new ElimErasedValueType, // Expand erased value types to their underlying implementation types + new VCElideAllocations, // Peep-hole optimization to eliminate unnecessary value class allocations + new Mixin, // Expand trait fields and trait initializers + new LazyVals, // Expand lazy vals + new Memoize, // Add private fields to getters and setters + new LinkScala2ImplClasses, // Forward calls to the implementation classes of traits defined by Scala 2.11 + new NonLocalReturns, // Expand non-local returns + new CapturedVars, // Represent vars captured by closures as heap objects + new Constructors, // Collect initialization code in primary constructors + // Note: constructors changes decls in transformTemplate, no InfoTransformers should be added after it + new FunctionalInterfaces,// Rewrites closures to implement @specialized types of Functions. + new GetClass), // Rewrites getClass calls on primitive types. + List(new LambdaLift, // Lifts out nested functions to class scope, storing free variables in environments + // Note: in this mini-phase block scopes are incorrect. 
No phases that rely on scopes should be here
+        new ElimStaticThis,      // Replace `this` references to static objects by global identifiers
+        new Flatten,             // Lift all inner classes to package scope
+        new RestoreScopes),      // Repair scopes rendered invalid by moving definitions in prior phases of the group
+   List(new ExpandPrivate,       // Widen private definitions accessed from nested classes
+        new CollectEntryPoints,  // Find classes with main methods
+        new LabelDefs),          // Converts calls to labels to jumps
+   List(new GenSJSIR),           // Generate .js code
+   List(new GenBCode)            // Generate JVM bytecode
+  )
+```
+
+Note that phases are grouped, so the `phases` method is of type
+`List[List[Phase]]`. The idea is that all phases in a group are
+*fused* into a single tree traversal. That way, phases can be kept
+small (most phases perform a single function) without requiring an
+excessive number of tree traversals (which are costly, because they
+have generally bad cache locality).
+
+Phases fall into four categories:
+
+ - Frontend phases: `FrontEnd`, `PostTyper` and `Pickler`. `FrontEnd` parses the source programs and generates
+   untyped abstract syntax trees, which are then typechecked and transformed into typed abstract syntax trees.
+   `PostTyper` performs checks and cleanups that require a fully typed program. In particular, it
+
+   - creates super accessors representing `super` calls in traits
+   - creates implementations of synthetic (compiler-implemented) methods
+   - avoids storing parameters passed unchanged from subclass to superclass in duplicate fields.
+
+   Finally, `Pickler` serializes the typed syntax trees produced by the frontend as TASTY data structures.
+
+ - High-level transformations: All phases from `FirstTransform` to `Erasure`. Most of these phases transform
+   syntax trees, expanding high-level constructs to more primitive ones. The last phase in the group, `Erasure`,
translates all types into types supported directly by the JVM. To do this, it performs another type checking
+   pass, but using the rules of the JVM's type system instead of Scala's.
+
+ - Low-level transformations: All phases from `ElimErasedValueType` to `LabelDefs`. These
+   further transform trees until they are essentially a structured version of Java bytecode.
+
+ - Code generators: These map the transformed trees to Java classfiles or JavaScript files.
+
+ diff --git a/docs/dotc-internals/periods.md b/docs/dotc-internals/periods.md new file mode 100644 index 000000000..a616ba8a8 --- /dev/null +++ b/docs/dotc-internals/periods.md @@ -0,0 +1,94 @@
+# Dotc's concept of time
+
+Conceptually, the `dotc` compiler's job is to maintain views of
+various artifacts associated with source code at all points in time.
+But what is *time* for `dotc`? In fact, it is a combination of
+compiler runs and compiler phases.
+
+The *hours* of the compiler's clocks are measured in compiler
+[runs](https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/Run.scala). Every
+run creates a new hour, which follows all the compiler runs (hours) that
+happened before. `dotc` is designed to be used as an incremental
+compiler that can support incremental builds, as well as interactions
+in an IDE and a REPL. This means that new runs can occur quite
+frequently. At the extreme, every keystroke in an editor or REPL can
+potentially launch a new compiler run, so an "hour" of
+compiler time might take only a fraction of a second in real time.
+
+The *minutes* of the compiler's clocks are measured in phases. At every
+compiler run, the compiler cycles through a number of
+[phases](https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/core/Phases.scala).
+The list of phases is defined in the [Compiler](https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/Compiler.scala) class.
+There are currently about 60 phases per run, so the minutes/hours
+analogy works out roughly.
After every phase the view the compiler has
+of the world changes: trees are transformed, types are gradually simplified
+from Scala types to JVM types, definitions are rearranged, and so on.
+
+Many pieces of information in the compiler are time-dependent. For
+instance, a Scala symbol representing a definition has a type, but
+that type will usually change as one goes from the higher-level Scala
+view of things to the lower-level JVM view. There are different ways
+to deal with this. Many compilers change the type of a symbol
+destructively according to the "current phase". Another, more
+functional approach might be to have different symbols representing
+the same definition at different phases, with each symbol carrying a
+different immutable type. `dotc` employs yet another scheme, which is
+inspired by functional reactive programming (FRP): symbols carry not a
+single type, but a function from compiler phase to type. So the type
+of a symbol is a time-indexed function, where time ranges over
+compiler phases.
+
+Typically, the definition of a symbol or other quantity remains stable
+for a number of phases. This leads us to the concept of a
+[period](https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/core/Periods.scala).
+Conceptually, a period is an interval of phases in a given
+compiler run. Periods are represented by three pieces of
+information:
+
+ - the ID of the current run,
+ - the ID of the phase starting the period,
+ - the number of phases in the period.
+
+All three pieces of information are encoded in a value class over a 32-bit integer.
+Here's the API for class `Period`:
+
+```scala
+  class Period(val code: Int) extends AnyVal {
+    def runId: RunId          // The run identifier of this period.
+    def firstPhaseId: PhaseId // The first phase of this period
+    def lastPhaseId: PhaseId  // The last phase of this period
+    def phaseId: PhaseId      // The phase identifier of this single-phase period.
+
+    def containsPhaseId(id: PhaseId): Boolean
+    def contains(that: Period): Boolean
+    def overlaps(that: Period): Boolean
+
+    def & (that: Period): Period
+    def | (that: Period): Period
+  }
+```
+
+We can access the parts of a period using `runId`, `firstPhaseId`,
+`lastPhaseId`, or using `phaseId` for periods consisting only of a
+single phase. They return `RunId` or `PhaseId` values, which are
+aliases of `Int`. `containsPhaseId`, `contains` and `overlaps` test
+whether a period contains a phase or a period as a sub-interval, or
+whether the interval overlaps with another period. Finally, `&` and
+`|` produce the intersection and the union of two period intervals
+(the union operation `|` takes as `runId` the `runId` of its left
+operand, as periods spanning different `runId`s cannot be constructed).
+
+Periods are constructed using two `apply` methods:
+
+```scala
+  object Period {
+
+    /** The single-phase period consisting of given run id and phase id */
+    def apply(rid: RunId, pid: PhaseId): Period
+
+    /** The period consisting of given run id, and lo/hi phase ids */
+    def apply(rid: RunId, loPid: PhaseId, hiPid: PhaseId): Period
+  }
+```
+
+As a sentinel value there's `Nowhere`, a period that is empty. diff --git a/src/dotty/tools/dotc/Bench.scala b/src/dotty/tools/dotc/Bench.scala index 2fc38d78c..56b6dabbe 100644 --- a/src/dotty/tools/dotc/Bench.scala +++ b/src/dotty/tools/dotc/Bench.scala @@ -8,6 +8,10 @@ package dotc import core.Contexts.Context import reporting.Reporter +/** A main class for running compiler benchmarks. Can instantiate a given + * number of compilers and run each (sequentially) a given number of times + * on the same sources.
+ */ object Bench extends Driver { @sharable private var numRuns = 1 diff --git a/src/dotty/tools/dotc/Compiler.scala b/src/dotty/tools/dotc/Compiler.scala index fe16243bb..fe48ac30e 100644 --- a/src/dotty/tools/dotc/Compiler.scala +++ b/src/dotty/tools/dotc/Compiler.scala @@ -7,7 +7,7 @@ import Periods._ import Symbols._ import Types._ import Scopes._ -import typer.{FrontEnd, Typer, Mode, ImportInfo, RefChecks} +import typer.{FrontEnd, Typer, ImportInfo, RefChecks} import reporting.{Reporter, ConsoleReporter} import Phases.Phase import transform._ @@ -18,6 +18,9 @@ import core.Denotations.SingleDenotation import dotty.tools.backend.jvm.{LabelDefs, GenBCode} import dotty.tools.backend.sjs.GenSJSIR +/** The central class of the dotc compiler. The job of a compiler is to create + * runs, which process given `phases` in a given `rootContext`. + */ class Compiler { /** Meta-ordering constraint: @@ -38,54 +41,55 @@ class Compiler { */ def phases: List[List[Phase]] = List( - List(new FrontEnd), - List(new PostTyper), - List(new Pickler), - List(new FirstTransform, - new CheckReentrant), - List(new RefChecks, - new CheckStatic, - new ElimRepeated, - new NormalizeFlags, - new ExtensionMethods, - new ExpandSAMs, - new TailRec, - new LiftTry, - new ClassOf), - List(new PatternMatcher, - new ExplicitOuter, - new ExplicitSelf, - new CrossCastAnd, - new Splitter), - List(new VCInlineMethods, - new SeqLiterals, - new InterceptedMethods, - new Getters, - new ElimByName, - new AugmentScala2Traits, - new ResolveSuper), - List(new Erasure), - List(new ElimErasedValueType, - new VCElideAllocations, - new Mixin, - new LazyVals, - new Memoize, - new LinkScala2ImplClasses, - new NonLocalReturns, - new CapturedVars, // capturedVars has a transformUnit: no phases should introduce local mutable vars here - new Constructors, // constructors changes decls in transformTemplate, no InfoTransformers should be added after it - new FunctionalInterfaces, - new GetClass), // getClass transformation 
should be applied to specialized methods - List(new LambdaLift, // in this mini-phase block scopes are incorrect. No phases that rely on scopes should be here - new ElimStaticThis, - new Flatten, - // new DropEmptyCompanions, - new RestoreScopes), - List(new ExpandPrivate, - new CollectEntryPoints, - new LabelDefs), - List(new GenSJSIR), - List(new GenBCode) + List(new FrontEnd), // Compiler frontend: scanner, parser, namer, typer + List(new PostTyper), // Additional checks and cleanups after type checking + List(new Pickler), // Generate TASTY info + List(new FirstTransform, // Some transformations to put trees into a canonical form + new CheckReentrant), // Internal use only: Check that compiled program has no data races involving global vars + List(new RefChecks, // Various checks mostly related to abstract members and overriding + new CheckStatic, // Check restrictions that apply to @static members + new ElimRepeated, // Rewrite vararg parameters and arguments + new NormalizeFlags, // Rewrite some definition flags + new ExtensionMethods, // Expand methods of value classes with extension methods + new ExpandSAMs, // Expand single abstract method closures to anonymous classes + new TailRec, // Rewrite tail recursion to loops + new LiftTry, // Put try expressions that might execute on non-empty stacks into their own methods + new ClassOf), // Expand `Predef.classOf` calls. + List(new PatternMatcher, // Compile pattern matches + new ExplicitOuter, // Add accessors to outer classes from nested ones. + new ExplicitSelf, // Make references to non-trivial self types explicit as casts + new CrossCastAnd, // Normalize selections involving intersection types. 
+ new Splitter), // Expand selections involving union types into conditionals + List(new VCInlineMethods, // Inlines calls to value class methods + new SeqLiterals, // Express vararg arguments as arrays + new InterceptedMethods, // Special handling of `==`, `|=`, `getClass` methods + new Getters, // Replace non-private vals and vars with getter defs (fields are added later) + new ElimByName, // Expand by-name parameters and arguments + new AugmentScala2Traits, // Expand traits defined in Scala 2.11 to simulate old-style rewritings + new ResolveSuper), // Implement super accessors and add forwarders to trait methods + List(new Erasure), // Rewrite types to JVM model, erasing all type parameters, abstract types and refinements. + List(new ElimErasedValueType, // Expand erased value types to their underlying implmementation types + new VCElideAllocations, // Peep-hole optimization to eliminate unnecessary value class allocations + new Mixin, // Expand trait fields and trait initializers + new LazyVals, // Expand lazy vals + new Memoize, // Add private fields to getters and setters + new LinkScala2ImplClasses, // Forward calls to the implementation classes of traits defined by Scala 2.11 + new NonLocalReturns, // Expand non-local returns + new CapturedVars, // Represent vars captured by closures as heap objects + new Constructors, // Collect initialization code in primary constructors + // Note: constructors changes decls in transformTemplate, no InfoTransformers should be added after it + new FunctionalInterfaces,// Rewrites closures to implement @specialized types of Functions. + new GetClass), // Rewrites getClass calls on primitive types. + List(new LambdaLift, // Lifts out nested functions to class scope, storing free variables in environments + // Note: in this mini-phase block scopes are incorrect. 
No phases that rely on scopes should be here + new ElimStaticThis, // Replace `this` references to static objects by global identifiers + new Flatten, // Lift all inner classes to package scope + new RestoreScopes), // Repair scopes rendered invalid by moving definitions in prior phases of the group + List(new ExpandPrivate, // Widen private definitions accessed from nested classes + new CollectEntryPoints, // Find classes with main methods + new LabelDefs), // Converts calls to labels to jumps + List(new GenSJSIR), // Generate .js code + List(new GenBCode) // Generate JVM bytecode ) var runId = 1 diff --git a/src/dotty/tools/dotc/Driver.scala b/src/dotty/tools/dotc/Driver.scala index 887274fa8..2e78854c1 100644 --- a/src/dotty/tools/dotc/Driver.scala +++ b/src/dotty/tools/dotc/Driver.scala @@ -15,8 +15,6 @@ import scala.util.control.NonFatal */ abstract class Driver extends DotClass { - val prompt = "\ndotc> " - protected def newCompiler(implicit ctx: Context): Compiler protected def emptyReporter: Reporter = new StoreReporter(null) diff --git a/src/dotty/tools/dotc/Main.scala b/src/dotty/tools/dotc/Main.scala index 6c473d8bb..a6844fbbc 100644 --- a/src/dotty/tools/dotc/Main.scala +++ b/src/dotty/tools/dotc/Main.scala @@ -3,8 +3,7 @@ package dotc import core.Contexts.Context -/* To do: - */ +/** Main class of the `dotc` batch compiler. */ object Main extends Driver { override def newCompiler(implicit ctx: Context): Compiler = new Compiler } diff --git a/src/dotty/tools/dotc/Resident.scala b/src/dotty/tools/dotc/Resident.scala index 18bb2ff4f..56f6684d0 100644 --- a/src/dotty/tools/dotc/Resident.scala +++ b/src/dotty/tools/dotc/Resident.scala @@ -6,7 +6,9 @@ import reporting.Reporter import java.io.EOFException import scala.annotation.tailrec -/** A compiler which stays resident between runs. +/** A compiler which stays resident between runs. 
This is more of a PoC than + * something that's expected to be used often + * * Usage: * * > scala dotty.tools.dotc.Resident <options> <initial files> @@ -31,6 +33,7 @@ class Resident extends Driver { private val quit = ":q" private val reset = ":reset" + private val prompt = "dotc> " private def getLine() = { Console.print(prompt) diff --git a/src/dotty/tools/dotc/Run.scala b/src/dotty/tools/dotc/Run.scala index ee808323a..7a0e555e4 100644 --- a/src/dotty/tools/dotc/Run.scala +++ b/src/dotty/tools/dotc/Run.scala @@ -13,6 +13,7 @@ import java.io.{BufferedWriter, OutputStreamWriter} import scala.reflect.io.VirtualFile import scala.util.control.NonFatal +/** A compiler run. Exports various methods to compile source files */ class Run(comp: Compiler)(implicit ctx: Context) { assert(comp.phases.last.last.id <= Periods.MaxPossiblePhaseId) diff --git a/src/dotty/tools/dotc/ast/Desugar.scala b/src/dotty/tools/dotc/ast/Desugar.scala index 2ab33a120..719f3d036 100644 --- a/src/dotty/tools/dotc/ast/Desugar.scala +++ b/src/dotty/tools/dotc/ast/Desugar.scala @@ -9,7 +9,6 @@ import Decorators._ import language.higherKinds import collection.mutable.ListBuffer import config.Printers._ -import typer.Mode object desugar { diff --git a/src/dotty/tools/dotc/ast/tpd.scala b/src/dotty/tools/dotc/ast/tpd.scala index a6d97478b..8d21953ae 100644 --- a/src/dotty/tools/dotc/ast/tpd.scala +++ b/src/dotty/tools/dotc/ast/tpd.scala @@ -10,7 +10,6 @@ import util.Positions._, Types._, Contexts._, Constants._, Names._, Flags._ import SymDenotations._, Symbols._, StdNames._, Annotations._, Trees._, Symbols._ import Denotations._, Decorators._, DenotTransformers._ import config.Printers._ -import typer.Mode import collection.mutable import typer.ErrorReporting._ diff --git a/src/dotty/tools/dotc/core/Annotations.scala b/src/dotty/tools/dotc/core/Annotations.scala index 2b27b5e01..dc4897233 100644 --- a/src/dotty/tools/dotc/core/Annotations.scala +++ b/src/dotty/tools/dotc/core/Annotations.scala @@ 
-5,7 +5,6 @@ import Symbols._, Types._, util.Positions._, Contexts._, Constants._, ast.tpd._ import config.ScalaVersion import StdNames._ import dotty.tools.dotc.ast.{tpd, untpd} -import dotty.tools.dotc.typer.ProtoTypes.FunProtoTyped object Annotations { diff --git a/src/dotty/tools/dotc/core/Contexts.scala b/src/dotty/tools/dotc/core/Contexts.scala index 2fc958a49..a0bb03e50 100644 --- a/src/dotty/tools/dotc/core/Contexts.scala +++ b/src/dotty/tools/dotc/core/Contexts.scala @@ -18,7 +18,7 @@ import util.Positions._ import ast.Trees._ import ast.untpd import util.{FreshNameCreator, SimpleMap, SourceFile, NoSource} -import typer._ +import typer.{Implicits, ImplicitRunInfo, ImportInfo, NamerContextOps, SearchHistory, TypeAssigner, Typer} import Implicits.ContextualImplicits import config.Settings._ import config.Config @@ -544,7 +544,7 @@ object Contexts { */ def initialize()(implicit ctx: Context): Unit = { _platform = newPlatform - definitions.init + definitions.init() } def squashed(p: Phase): Phase = { diff --git a/src/dotty/tools/dotc/core/Decorators.scala b/src/dotty/tools/dotc/core/Decorators.scala index 60c019bce..7d108a459 100644 --- a/src/dotty/tools/dotc/core/Decorators.scala +++ b/src/dotty/tools/dotc/core/Decorators.scala @@ -7,7 +7,6 @@ import Contexts._, Names._, Phases._, printing.Texts._, printing.Printer, printi import util.Positions.Position, util.SourcePosition import collection.mutable.ListBuffer import dotty.tools.dotc.transform.TreeTransforms._ -import typer.Mode import scala.language.implicitConversions /** This object provides useful implicit decorators for types defined elsewhere */ diff --git a/src/dotty/tools/dotc/core/Definitions.scala b/src/dotty/tools/dotc/core/Definitions.scala index 6f8a8f837..d8c882d5c 100644 --- a/src/dotty/tools/dotc/core/Definitions.scala +++ b/src/dotty/tools/dotc/core/Definitions.scala @@ -798,7 +798,7 @@ class Definitions { private[this] var _isInitialized = false private def isInitialized = _isInitialized - 
def init(implicit ctx: Context) = { + def init()(implicit ctx: Context) = { this.ctx = ctx if (!_isInitialized) { // force initialization of every symbol that is synthesized or hijacked by the compiler diff --git a/src/dotty/tools/dotc/core/Denotations.scala b/src/dotty/tools/dotc/core/Denotations.scala index b52c11201..218fb8561 100644 --- a/src/dotty/tools/dotc/core/Denotations.scala +++ b/src/dotty/tools/dotc/core/Denotations.scala @@ -18,7 +18,6 @@ import printing.Texts._ import printing.Printer import io.AbstractFile import config.Config -import typer.Mode import util.common._ import collection.mutable.ListBuffer import Decorators.SymbolIteratorDecorator diff --git a/src/dotty/tools/dotc/typer/Mode.scala b/src/dotty/tools/dotc/core/Mode.scala index 55d44ad7a..5b3dbc872 100644 --- a/src/dotty/tools/dotc/typer/Mode.scala +++ b/src/dotty/tools/dotc/core/Mode.scala @@ -1,7 +1,6 @@ -package dotty.tools.dotc.typer - -import collection.mutable +package dotty.tools.dotc.core +/** A collection of mode bits that are part of a context */ case class Mode(val bits: Int) extends AnyVal { import Mode._ def | (that: Mode) = Mode(bits | that.bits) diff --git a/src/dotty/tools/dotc/core/SymDenotations.scala b/src/dotty/tools/dotc/core/SymDenotations.scala index b808335fe..78acd8f1a 100644 --- a/src/dotty/tools/dotc/core/SymDenotations.scala +++ b/src/dotty/tools/dotc/core/SymDenotations.scala @@ -13,7 +13,6 @@ import Decorators.SymbolIteratorDecorator import ast._ import annotation.tailrec import CheckRealizable._ -import typer.Mode import util.SimpleMap import util.Stats import config.Config diff --git a/src/dotty/tools/dotc/core/TypeApplications.scala b/src/dotty/tools/dotc/core/TypeApplications.scala index 17573466f..d5f44e632 100644 --- a/src/dotty/tools/dotc/core/TypeApplications.scala +++ b/src/dotty/tools/dotc/core/TypeApplications.scala @@ -12,7 +12,6 @@ import Names._ import NameOps._ import Flags._ import StdNames.tpnme -import typer.Mode import 
util.Positions.Position import config.Printers._ import collection.mutable diff --git a/src/dotty/tools/dotc/core/TypeComparer.scala b/src/dotty/tools/dotc/core/TypeComparer.scala index c5321572c..3010e6fc7 100644 --- a/src/dotty/tools/dotc/core/TypeComparer.scala +++ b/src/dotty/tools/dotc/core/TypeComparer.scala @@ -3,7 +3,6 @@ package dotc package core import Types._, Contexts._, Symbols._, Flags._, Names._, NameOps._, Denotations._ -import typer.Mode import Decorators._ import StdNames.{nme, tpnme} import collection.mutable @@ -238,10 +237,10 @@ class TypeComparer(initctx: Context) extends DotClass with ConstraintHandling { case OrType(tp21, tp22) => if (tp21.stripTypeVar eq tp22.stripTypeVar) isSubType(tp1, tp21) else secondTry(tp1, tp2) - case TypeErasure.ErasedValueType(cls2, underlying2) => + case TypeErasure.ErasedValueType(tycon1, underlying2) => def compareErasedValueType = tp1 match { - case TypeErasure.ErasedValueType(cls1, underlying1) => - (cls1 eq cls2) && isSameType(underlying1, underlying2) + case TypeErasure.ErasedValueType(tycon2, underlying1) => + (tycon1.symbol eq tycon2.symbol) && isSameType(underlying1, underlying2) case _ => secondTry(tp1, tp2) } diff --git a/src/dotty/tools/dotc/core/TypeErasure.scala b/src/dotty/tools/dotc/core/TypeErasure.scala index 26cac4f72..89077897a 100644 --- a/src/dotty/tools/dotc/core/TypeErasure.scala +++ b/src/dotty/tools/dotc/core/TypeErasure.scala @@ -6,7 +6,6 @@ import Symbols._, Types._, Contexts._, Flags._, Names._, StdNames._, Decorators. import Uniques.unique import dotc.transform.ExplicitOuter._ import dotc.transform.ValueClasses._ -import typer.Mode import util.DotClass /** Erased types are: @@ -67,20 +66,20 @@ object TypeErasure { * Nothing. This is because this type is only useful for type adaptation (see * [[Erasure.Boxing#adaptToType]]). 
* - * @param cls The value class symbol + * @param tycon A TypeRef referring to the value class symbol * @param erasedUnderlying The erased type of the single field of the value class */ - abstract case class ErasedValueType(cls: ClassSymbol, erasedUnderlying: Type) + abstract case class ErasedValueType(tycon: TypeRef, erasedUnderlying: Type) extends CachedGroundType with ValueType { - override def computeHash = doHash(cls, erasedUnderlying) + override def computeHash = doHash(tycon, erasedUnderlying) } - final class CachedErasedValueType(cls: ClassSymbol, erasedUnderlying: Type) - extends ErasedValueType(cls, erasedUnderlying) + final class CachedErasedValueType(tycon: TypeRef, erasedUnderlying: Type) + extends ErasedValueType(tycon, erasedUnderlying) object ErasedValueType { - def apply(cls: ClassSymbol, erasedUnderlying: Type)(implicit ctx: Context) = { - unique(new CachedErasedValueType(cls, erasedUnderlying)) + def apply(tycon: TypeRef, erasedUnderlying: Type)(implicit ctx: Context) = { + unique(new CachedErasedValueType(tycon, erasedUnderlying)) } } @@ -412,7 +411,7 @@ class TypeErasure(isJava: Boolean, semiEraseVCs: Boolean, isConstructor: Boolean private def eraseDerivedValueClassRef(tref: TypeRef)(implicit ctx: Context): Type = { val cls = tref.symbol.asClass val underlying = underlyingOfValueClass(cls) - if (underlying.exists) ErasedValueType(cls, valueErasure(underlying)) + if (underlying.exists) ErasedValueType(tref, valueErasure(underlying)) else NoType } diff --git a/src/dotty/tools/dotc/core/Types.scala b/src/dotty/tools/dotc/core/Types.scala index 9161ece98..fa049815a 100644 --- a/src/dotty/tools/dotc/core/Types.scala +++ b/src/dotty/tools/dotc/core/Types.scala @@ -31,7 +31,6 @@ import config.Config import config.Printers._ import annotation.tailrec import Flags.FlagSet -import typer.Mode import language.implicitConversions import scala.util.hashing.{ MurmurHash3 => hashing } @@ -1736,7 +1735,7 @@ object Types { if (symbol.exists && 
!candidate.symbol.exists) { // recompute from previous symbol val ownSym = symbol val newd = asMemberOf(prefix) - candidate.withDenot(asMemberOf(prefix).suchThat(_ eq ownSym)) + candidate.withDenot(newd.suchThat(_.signature == ownSym.signature)) } else candidate } @@ -3514,7 +3513,7 @@ object Types { object CyclicReference { def apply(denot: SymDenotation)(implicit ctx: Context): CyclicReference = { val ex = new CyclicReference(denot) - if (!(ctx.mode is typer.Mode.CheckCyclic)) { + if (!(ctx.mode is Mode.CheckCyclic)) { cyclicErrors.println(ex.getMessage) for (elem <- ex.getStackTrace take 200) cyclicErrors.println(elem.toString) diff --git a/src/dotty/tools/dotc/core/classfile/ClassfileParser.scala b/src/dotty/tools/dotc/core/classfile/ClassfileParser.scala index 25558a79a..f7a69aa53 100644 --- a/src/dotty/tools/dotc/core/classfile/ClassfileParser.scala +++ b/src/dotty/tools/dotc/core/classfile/ClassfileParser.scala @@ -12,7 +12,6 @@ import scala.collection.{ mutable, immutable } import scala.collection.mutable.{ ListBuffer, ArrayBuffer } import scala.annotation.switch import typer.Checking.checkNonCyclic -import typer.Mode import io.AbstractFile import scala.util.control.NonFatal diff --git a/src/dotty/tools/dotc/core/tasty/TreeUnpickler.scala b/src/dotty/tools/dotc/core/tasty/TreeUnpickler.scala index eb3369184..b547862b4 100644 --- a/src/dotty/tools/dotc/core/tasty/TreeUnpickler.scala +++ b/src/dotty/tools/dotc/core/tasty/TreeUnpickler.scala @@ -13,7 +13,6 @@ import TastyUnpickler._, TastyBuffer._, PositionPickler._ import scala.annotation.{tailrec, switch} import scala.collection.mutable.ListBuffer import scala.collection.{ mutable, immutable } -import typer.Mode import config.Printers.pickling /** Unpickler for typed trees diff --git a/src/dotty/tools/dotc/core/unpickleScala2/Scala2Unpickler.scala b/src/dotty/tools/dotc/core/unpickleScala2/Scala2Unpickler.scala index 2831de3e0..83d427a8f 100644 --- 
a/src/dotty/tools/dotc/core/unpickleScala2/Scala2Unpickler.scala +++ b/src/dotty/tools/dotc/core/unpickleScala2/Scala2Unpickler.scala @@ -17,7 +17,6 @@ import printing.Printer import io.AbstractFile import util.common._ import typer.Checking.checkNonCyclic -import typer.Mode import PickleBuffer._ import scala.reflect.internal.pickling.PickleFormat._ import Decorators._ @@ -180,7 +179,7 @@ class Scala2Unpickler(bytes: Array[Byte], classRoot: ClassDenotation, moduleClas val ex = new BadSignature( sm"""error reading Scala signature of $classRoot from $source: |error occurred at position $readIndex: $msg""") - if (ctx.debug) original.getOrElse(ex).printStackTrace() + if (ctx.debug || true) original.getOrElse(ex).printStackTrace() // temporarily enable printing of original failure signature to debug failing builds throw ex } diff --git a/src/dotty/tools/dotc/printing/PlainPrinter.scala b/src/dotty/tools/dotc/printing/PlainPrinter.scala index 6d026dde7..3fb220afe 100644 --- a/src/dotty/tools/dotc/printing/PlainPrinter.scala +++ b/src/dotty/tools/dotc/printing/PlainPrinter.scala @@ -8,7 +8,6 @@ import StdNames.{nme, tpnme} import ast.Trees._, ast._ import java.lang.Integer.toOctalString import config.Config.summarizeDepth -import typer.Mode import scala.annotation.switch class PlainPrinter(_ctx: Context) extends Printer { diff --git a/src/dotty/tools/dotc/printing/RefinedPrinter.scala b/src/dotty/tools/dotc/printing/RefinedPrinter.scala index 27e42fddf..614a274b4 100644 --- a/src/dotty/tools/dotc/printing/RefinedPrinter.scala +++ b/src/dotty/tools/dotc/printing/RefinedPrinter.scala @@ -157,8 +157,8 @@ class RefinedPrinter(_ctx: Context) extends PlainPrinter(_ctx) { return toText(tp.info) case ExprType(result) => return "=> " ~ toText(result) - case ErasedValueType(clazz, underlying) => - return "ErasedValueType(" ~ toText(clazz.typeRef) ~ ", " ~ toText(underlying) ~ ")" + case ErasedValueType(tycon, underlying) => + return "ErasedValueType(" ~ toText(tycon) ~ ", " ~ 
toText(underlying) ~ ")" case tp: ClassInfo => return toTextParents(tp.parentsWithArgs) ~ "{...}" case JavaArrayType(elemtp) => diff --git a/src/dotty/tools/dotc/repl/CompilingInterpreter.scala b/src/dotty/tools/dotc/repl/CompilingInterpreter.scala index 7d1da1419..bc898488d 100644 --- a/src/dotty/tools/dotc/repl/CompilingInterpreter.scala +++ b/src/dotty/tools/dotc/repl/CompilingInterpreter.scala @@ -60,6 +60,8 @@ class CompilingInterpreter(out: PrintWriter, ictx: Context) extends Compiler wit import ast.untpd._ import CompilingInterpreter._ + ictx.base.initialize()(ictx) + /** directory to save .class files to */ val virtualDirectory = if (ictx.settings.d.isDefault(ictx)) new VirtualDirectory("(memory)", None) @@ -175,7 +177,7 @@ class CompilingInterpreter(out: PrintWriter, ictx: Context) extends Compiler wit // if (prevRequests.isEmpty) // new Run(this) // initialize the compiler // (not sure this is needed) // parse - parse(indentCode(line)) match { + parse(line) match { case None => Interpreter.Incomplete case Some(Nil) => Interpreter.Error // parse error or empty input case Some(tree :: Nil) if tree.isTerm && !tree.isInstanceOf[Assign] => @@ -271,7 +273,7 @@ class CompilingInterpreter(out: PrintWriter, ictx: Context) extends Compiler wit // header for the wrapper object code.println("object " + objectName + " {") code.print(importsPreamble) - code.println(indentCode(toCompute)) + code.println(toCompute) handlers.foreach(_.extraCodeToEvaluate(this,code)) code.println(importsTrailer) //end the wrapper object @@ -477,7 +479,7 @@ class CompilingInterpreter(out: PrintWriter, ictx: Context) extends Compiler wit addWrapper() if (handler.statement.isInstanceOf[Import]) - preamble.append(handler.statement.toString + ";\n") + preamble.append(handler.statement.show + ";\n") // give wildcard imports a import wrapper all to their own if (handler.importsWildcard) @@ -645,7 +647,7 @@ class CompilingInterpreter(out: PrintWriter, ictx: Context) extends Compiler wit private 
class ImportHandler(imp: Import) extends StatementHandler(imp) { override def resultExtractionCode(req: Request, code: PrintWriter): Unit = { - code.println("+ \"" + imp.toString + "\\n\"") + code.println("+ \"" + imp.show + "\\n\"") } def isWildcardSelector(tree: Tree) = tree match { @@ -734,20 +736,6 @@ class CompilingInterpreter(out: PrintWriter, ictx: Context) extends Compiler wit /** Clean up a string for output */ private def clean(str: String)(implicit ctx: Context) = truncPrintString(stripWrapperGunk(str)) - - /** Indent some code by the width of the scala> prompt. - * This way, compiler error messages read better. - */ - def indentCode(code: String) = { - val spaces = " " - - stringFrom(str => - for (line <- code.lines) { - str.print(spaces) - str.print(line + "\n") - str.flush() - }) - } } /** Utility methods for the Interpreter. */ diff --git a/src/dotty/tools/dotc/repl/InteractiveReader.scala b/src/dotty/tools/dotc/repl/InteractiveReader.scala index 96c55ebd0..29ecd3c9d 100644 --- a/src/dotty/tools/dotc/repl/InteractiveReader.scala +++ b/src/dotty/tools/dotc/repl/InteractiveReader.scala @@ -8,24 +8,19 @@ trait InteractiveReader { val interactive: Boolean } -/** TODO Enable jline support. - * The current Scala REPL know how to do this flexibly. +/** The current Scala REPL knows how to do this flexibly. */ object InteractiveReader { /** Create an interactive reader. Uses JLine if the * library is available, but otherwise uses a * SimpleReader. 
*/ - def createDefault(): InteractiveReader = new SimpleReader() - /* - { + def createDefault(): InteractiveReader = { try { - new JLineReader + new JLineReader() } catch { case e => //out.println("jline is not available: " + e) //debug new SimpleReader() } } -*/ - } diff --git a/src/dotty/tools/dotc/repl/InterpreterLoop.scala b/src/dotty/tools/dotc/repl/InterpreterLoop.scala index eedec3c82..4ac9602e7 100644 --- a/src/dotty/tools/dotc/repl/InterpreterLoop.scala +++ b/src/dotty/tools/dotc/repl/InterpreterLoop.scala @@ -21,10 +21,10 @@ import scala.concurrent.ExecutionContext.Implicits.global * @author Lex Spoon * @author Martin Odersky */ -class InterpreterLoop( - compiler: Compiler, - private var in: InteractiveReader, - out: PrintWriter)(implicit ctx: Context) { +class InterpreterLoop(compiler: Compiler, config: REPL.Config)(implicit ctx: Context) { + import config._ + + private var in = input val interpreter = compiler.asInstanceOf[Interpreter] @@ -52,24 +52,20 @@ class InterpreterLoop( /** print a friendly help message */ def printHelp(): Unit = { printWelcome() - out.println("Type :load followed by a filename to load a Scala file.") - out.println("Type :replay to reset execution and replay all previous commands.") - out.println("Type :quit to exit the interpreter.") + output.println("Type :load followed by a filename to load a Scala file.") + output.println("Type :replay to reset execution and replay all previous commands.") + output.println("Type :quit to exit the interpreter.") } /** Print a welcome message */ def printWelcome(): Unit = { - out.println(s"Welcome to Scala$version " + " (" + + output.println(s"Welcome to Scala$version " + " (" + System.getProperty("java.vm.name") + ", Java " + System.getProperty("java.version") + ")." 
) - out.println("Type in expressions to have them evaluated.") - out.println("Type :help for more information.") - out.flush() + output.println("Type in expressions to have them evaluated.") + output.println("Type :help for more information.") + output.flush() } - /** Prompt to print when awaiting input */ - val prompt = "scala> " - val continuationPrompt = " | " - val version = ".next (pre-alpha)" /** The first interpreted command always takes a couple of seconds @@ -92,7 +88,7 @@ class InterpreterLoop( val (keepGoing, finalLineOpt) = command(line) if (keepGoing) { finalLineOpt.foreach(addReplay) - out.flush() + output.flush() repl() } } @@ -103,16 +99,16 @@ class InterpreterLoop( new FileReader(filename) } catch { case _: IOException => - out.println("Error opening file: " + filename) + output.println("Error opening file: " + filename) return } val oldIn = in val oldReplay = replayCommandsRev try { val inFile = new BufferedReader(fileIn) - in = new SimpleReader(inFile, out, false) - out.println("Loading " + filename + "...") - out.flush + in = new SimpleReader(inFile, output, false) + output.println("Loading " + filename + "...") + output.flush repl() } finally { in = oldIn @@ -124,10 +120,10 @@ class InterpreterLoop( /** create a new interpreter and replay all commands so far */ def replay(): Unit = { for (cmd <- replayCommands) { - out.println("Replaying: " + cmd) - out.flush() // because maybe cmd will have its own output + output.println("Replaying: " + cmd) + output.flush() // because maybe cmd will have its own output command(cmd) - out.println + output.println } } @@ -138,12 +134,12 @@ class InterpreterLoop( def withFile(command: String)(action: String => Unit): Unit = { val spaceIdx = command.indexOf(' ') if (spaceIdx <= 0) { - out.println("That command requires a filename to be specified.") + output.println("That command requires a filename to be specified.") return } val filename = command.substring(spaceIdx).trim if (!new File(filename).exists) { - 
out.println("That file does not exist") + output.println("That file does not exist") return } action(filename) @@ -169,7 +165,7 @@ class InterpreterLoop( else if (line matches replayRegexp) replay() else if (line startsWith ":") - out.println("Unknown command. Type :help for help.") + output.println("Unknown command. Type :help for help.") else shouldReplay = interpretStartingWith(line) @@ -188,7 +184,7 @@ class InterpreterLoop( case Interpreter.Error => None case Interpreter.Incomplete => if (in.interactive && code.endsWith("\n\n")) { - out.println("You typed two blank lines. Starting a new command.") + output.println("You typed two blank lines. Starting a new command.") None } else { val nextLine = in.readLine(continuationPrompt) @@ -207,7 +203,7 @@ class InterpreterLoop( val cmd = ":load " + filename command(cmd) replayCommandsRev = cmd :: replayCommandsRev - out.println() + output.println() } case _ => } diff --git a/src/dotty/tools/dotc/repl/JLineReader.scala b/src/dotty/tools/dotc/repl/JLineReader.scala new file mode 100644 index 000000000..592b19df5 --- /dev/null +++ b/src/dotty/tools/dotc/repl/JLineReader.scala @@ -0,0 +1,15 @@ +package dotty.tools +package dotc +package repl + +import jline.console.ConsoleReader + +/** Adaptor for JLine + */ +class JLineReader extends InteractiveReader { + val reader = new ConsoleReader() + + val interactive = true + + def readLine(prompt: String) = reader.readLine(prompt) +} diff --git a/src/dotty/tools/dotc/repl/REPL.scala b/src/dotty/tools/dotc/repl/REPL.scala index 2d6a3c742..e5ff2d3af 100644 --- a/src/dotty/tools/dotc/repl/REPL.scala +++ b/src/dotty/tools/dotc/repl/REPL.scala @@ -23,27 +23,37 @@ import java.io.{BufferedReader, File, FileReader, PrintWriter} */ class REPL extends Driver { - /** The default input reader */ - def input(implicit ctx: Context): InteractiveReader = { - val emacsShell = System.getProperty("env.emacs", "") != "" - //println("emacsShell="+emacsShell) //debug - if (ctx.settings.Xnojline.value 
|| emacsShell) new SimpleReader() - else InteractiveReader.createDefault() - } - - /** The defult output writer */ - def output: PrintWriter = new NewLinePrintWriter(new ConsoleWriter, true) + lazy val config = new REPL.Config override def newCompiler(implicit ctx: Context): Compiler = - new repl.CompilingInterpreter(output, ctx) + new repl.CompilingInterpreter(config.output, ctx) override def sourcesRequired = false override def doCompile(compiler: Compiler, fileNames: List[String])(implicit ctx: Context): Reporter = { if (fileNames.isEmpty) - new InterpreterLoop(compiler, input, output).run() + new InterpreterLoop(compiler, config).run() else ctx.error(s"don't now what to do with $fileNames%, %") ctx.reporter } } + +object REPL { + class Config { + val prompt = "scala> " + val continuationPrompt = " | " + val version = ".next (pre-alpha)" + + /** The default input reader */ + def input(implicit ctx: Context): InteractiveReader = { + val emacsShell = System.getProperty("env.emacs", "") != "" + //println("emacsShell="+emacsShell) //debug + if (ctx.settings.Xnojline.value || emacsShell) new SimpleReader() + else InteractiveReader.createDefault() + } + + /** The default output writer */ + def output: PrintWriter = new NewLinePrintWriter(new ConsoleWriter, true) + } +} diff --git a/src/dotty/tools/dotc/reporting/Reporter.scala b/src/dotty/tools/dotc/reporting/Reporter.scala index 8236f93ef..44defa6b1 100644 --- a/src/dotty/tools/dotc/reporting/Reporter.scala +++ b/src/dotty/tools/dotc/reporting/Reporter.scala @@ -10,7 +10,7 @@ import collection.mutable import config.Settings.Setting import config.Printers import java.lang.System.currentTimeMillis -import typer.Mode +import core.Mode import interfaces.Diagnostic.{ERROR, WARNING, INFO} object Reporter { diff --git a/src/dotty/tools/dotc/transform/ElimStaticThis.scala b/src/dotty/tools/dotc/transform/ElimStaticThis.scala index 7df29b0b0..70a610188 100644 --- a/src/dotty/tools/dotc/transform/ElimStaticThis.scala +++ 
b/src/dotty/tools/dotc/transform/ElimStaticThis.scala @@ -10,7 +10,7 @@ import dotty.tools.dotc.core.SymDenotations.SymDenotation import TreeTransforms.{MiniPhaseTransform, TransformerInfo} import dotty.tools.dotc.core.Types.{ThisType, TermRef} -/** Replace This references to module classes in static methods by global identifiers to the +/** Replace This references to module classes in static methods by global identifiers to the * corresponding modules. */ class ElimStaticThis extends MiniPhaseTransform { diff --git a/src/dotty/tools/dotc/transform/Erasure.scala b/src/dotty/tools/dotc/transform/Erasure.scala index 8d890902e..7acb14af4 100644 --- a/src/dotty/tools/dotc/transform/Erasure.scala +++ b/src/dotty/tools/dotc/transform/Erasure.scala @@ -25,7 +25,7 @@ import dotty.tools.dotc.core.Flags import ValueClasses._ import TypeUtils._ import ExplicitOuter._ -import typer.Mode +import core.Mode class Erasure extends Phase with DenotTransformer { thisTransformer => @@ -153,8 +153,8 @@ object Erasure extends TypeTestsCasts{ final def box(tree: Tree, target: => String = "")(implicit ctx: Context): Tree = ctx.traceIndented(i"boxing ${tree.showSummary}: ${tree.tpe} into $target") { tree.tpe.widen match { - case ErasedValueType(clazz, _) => - New(clazz.typeRef, cast(tree, underlyingOfValueClass(clazz)) :: Nil) // todo: use adaptToType? + case ErasedValueType(tycon, _) => + New(tycon, cast(tree, underlyingOfValueClass(tycon.symbol.asClass)) :: Nil) // todo: use adaptToType? 
case tp => val cls = tp.classSymbol if (cls eq defn.UnitClass) constant(tree, ref(defn.BoxedUnit_UNIT)) @@ -173,10 +173,10 @@ object Erasure extends TypeTestsCasts{ def unbox(tree: Tree, pt: Type)(implicit ctx: Context): Tree = ctx.traceIndented(i"unboxing ${tree.showSummary}: ${tree.tpe} as a $pt") { pt match { - case ErasedValueType(clazz, underlying) => + case ErasedValueType(tycon, underlying) => def unboxedTree(t: Tree) = - adaptToType(t, clazz.typeRef) - .select(valueClassUnbox(clazz)) + adaptToType(t, tycon) + .select(valueClassUnbox(tycon.symbol.asClass)) .appliedToNone // Null unboxing needs to be treated separately since we cannot call a method on null. @@ -185,7 +185,7 @@ object Erasure extends TypeTestsCasts{ val tree1 = if (tree.tpe isRef defn.NullClass) adaptToType(tree, underlying) - else if (!(tree.tpe <:< clazz.typeRef)) { + else if (!(tree.tpe <:< tycon)) { assert(!(tree.tpe.typeSymbol.isPrimitiveValueClass)) val nullTree = Literal(Constant(null)) val unboxedNull = adaptToType(nullTree, underlying) @@ -223,12 +223,12 @@ object Erasure extends TypeTestsCasts{ if treeElem.widen.isPrimitiveValueType && !ptElem.isPrimitiveValueType => // See SI-2386 for one example of when this might be necessary. 
cast(ref(defn.runtimeMethodRef(nme.toObjectArray)).appliedTo(tree), pt) - case (_, ErasedValueType(cls, _)) => - ref(u2evt(cls)).appliedTo(tree) + case (_, ErasedValueType(tycon, _)) => + ref(u2evt(tycon.symbol.asClass)).appliedTo(tree) case _ => tree.tpe.widen match { - case ErasedValueType(cls, _) => - ref(evt2u(cls)).appliedTo(tree) + case ErasedValueType(tycon, _) => + ref(evt2u(tycon.symbol.asClass)).appliedTo(tree) case _ => if (pt.isPrimitiveValueType) primitiveConversion(tree, pt.classSymbol) diff --git a/src/dotty/tools/dotc/transform/ExtensionMethods.scala b/src/dotty/tools/dotc/transform/ExtensionMethods.scala index c5ab49c9c..a1d2e5c68 100644 --- a/src/dotty/tools/dotc/transform/ExtensionMethods.scala +++ b/src/dotty/tools/dotc/transform/ExtensionMethods.scala @@ -70,7 +70,7 @@ class ExtensionMethods extends MiniPhaseTransform with DenotTransformer with Ful } val underlying = valueErasure(underlyingOfValueClass(valueClass)) - val evt = ErasedValueType(valueClass, underlying) + val evt = ErasedValueType(valueClass.typeRef, underlying) val u2evtSym = ctx.newSymbol(moduleSym, nme.U2EVT, Synthetic | Method, MethodType(List(nme.x_0), List(underlying), evt)) val evt2uSym = ctx.newSymbol(moduleSym, nme.EVT2U, Synthetic | Method, diff --git a/src/dotty/tools/dotc/transform/FullParameterization.scala b/src/dotty/tools/dotc/transform/FullParameterization.scala index e9057e885..be64df384 100644 --- a/src/dotty/tools/dotc/transform/FullParameterization.scala +++ b/src/dotty/tools/dotc/transform/FullParameterization.scala @@ -12,6 +12,8 @@ import NameOps._ import ast._ import ast.Trees._ +import scala.reflect.internal.util.Collections + /** Provides methods to produce fully parameterized versions of instance methods, * where the `this` of the enclosing class is abstracted out in an extra leading * `$this` parameter and type parameters of the class become additional type @@ -86,9 +88,12 @@ trait FullParameterization { * } * * If a self type is present, $this has this 
self type as its type. + * * @param abstractOverClass if true, include the type parameters of the class in the method's list of type parameters. + * @param liftThisType if true, require the created $this to be typed as $this: (Foo[A] & Foo.this). + * This is needed if the created member stays inside the scope of Foo (as in tailrec). */ - def fullyParameterizedType(info: Type, clazz: ClassSymbol, abstractOverClass: Boolean = true)(implicit ctx: Context): Type = { + def fullyParameterizedType(info: Type, clazz: ClassSymbol, abstractOverClass: Boolean = true, liftThisType: Boolean = false)(implicit ctx: Context): Type = { val (mtparamCount, origResult) = info match { case info @ PolyType(mtnames) => (mtnames.length, info.resultType) case info: ExprType => (0, info.resultType) @@ -100,7 +105,8 @@ trait FullParameterization { /** The method result type */ def resultType(mapClassParams: Type => Type) = { val thisParamType = mapClassParams(clazz.classInfo.selfType) - MethodType(nme.SELF :: Nil, thisParamType :: Nil)(mt => + val firstArgType = if (liftThisType) thisParamType & clazz.thisType else thisParamType + MethodType(nme.SELF :: Nil, firstArgType :: Nil)(mt => mapClassParams(origResult).substThisUnlessStatic(clazz, MethodParam(mt, 0))) } @@ -217,12 +223,26 @@ trait FullParameterization { * - the `this` of the enclosing class, * - the value parameters of the original method `originalDef`. 
*/ - def forwarder(derived: TermSymbol, originalDef: DefDef, abstractOverClass: Boolean = true)(implicit ctx: Context): Tree = - ref(derived.termRef) - .appliedToTypes(allInstanceTypeParams(originalDef, abstractOverClass).map(_.typeRef)) - .appliedTo(This(originalDef.symbol.enclosingClass.asClass)) - .appliedToArgss(originalDef.vparamss.nestedMap(vparam => ref(vparam.symbol))) - .withPos(originalDef.rhs.pos) + def forwarder(derived: TermSymbol, originalDef: DefDef, abstractOverClass: Boolean = true, liftThisType: Boolean = false)(implicit ctx: Context): Tree = { + val fun = + ref(derived.termRef) + .appliedToTypes(allInstanceTypeParams(originalDef, abstractOverClass).map(_.typeRef)) + .appliedTo(This(originalDef.symbol.enclosingClass.asClass)) + + (if (!liftThisType) + fun.appliedToArgss(originalDef.vparamss.nestedMap(vparam => ref(vparam.symbol))) + else { + // this type could have changed on forwarding. Need to insert a cast. + val args = Collections.map2(originalDef.vparamss, fun.tpe.paramTypess)((vparams, paramTypes) => + Collections.map2(vparams, paramTypes)((vparam, paramType) => { + assert(vparam.tpe <:< paramType.widen) // type should still conform to widened type + ref(vparam.symbol).ensureConforms(paramType) + }) + ) + fun.appliedToArgss(args) + + }).withPos(originalDef.rhs.pos) + } } object FullParameterization { diff --git a/src/dotty/tools/dotc/transform/GetClass.scala b/src/dotty/tools/dotc/transform/GetClass.scala index f25fd6f64..6a9a5fda2 100644 --- a/src/dotty/tools/dotc/transform/GetClass.scala +++ b/src/dotty/tools/dotc/transform/GetClass.scala @@ -20,7 +20,8 @@ class GetClass extends MiniPhaseTransform { override def phaseName: String = "getClass" - override def runsAfter: Set[Class[_ <: Phase]] = Set(classOf[Erasure]) + // getClass transformation should be applied to specialized methods + override def runsAfter: Set[Class[_ <: Phase]] = Set(classOf[Erasure], classOf[FunctionalInterfaces]) override def transformApply(tree: Apply)(implicit ctx: 
Context, info: TransformerInfo): Tree = { import ast.Trees._ diff --git a/src/dotty/tools/dotc/transform/LazyVals.scala b/src/dotty/tools/dotc/transform/LazyVals.scala index fc02e68cc..e42c7bae9 100644 --- a/src/dotty/tools/dotc/transform/LazyVals.scala +++ b/src/dotty/tools/dotc/transform/LazyVals.scala @@ -3,7 +3,6 @@ package transform import dotty.tools.dotc.core.Annotations.Annotation import dotty.tools.dotc.core.Phases.NeedsCompanions -import dotty.tools.dotc.typer.Mode import scala.collection.mutable import core._ diff --git a/src/dotty/tools/dotc/transform/PatternMatcher.scala b/src/dotty/tools/dotc/transform/PatternMatcher.scala index b4e32fa66..35e772cd1 100644 --- a/src/dotty/tools/dotc/transform/PatternMatcher.scala +++ b/src/dotty/tools/dotc/transform/PatternMatcher.scala @@ -21,7 +21,7 @@ import ast.Trees._ import Applications._ import TypeApplications._ import SymUtils._, core.NameOps._ -import typer.Mode +import core.Mode import dotty.tools.dotc.util.Positions.Position import dotty.tools.dotc.core.Decorators._ @@ -303,8 +303,139 @@ class PatternMatcher extends MiniPhaseTransform with DenotTransformer {thisTrans def optimizeCases(prevBinder: Symbol, cases: List[List[TreeMaker]], pt: Type): (List[List[TreeMaker]], List[Tree]) def analyzeCases(prevBinder: Symbol, cases: List[List[TreeMaker]], pt: Type, suppression: Suppression): Unit = {} - def emitSwitch(scrut: Tree, scrutSym: Symbol, cases: List[List[TreeMaker]], pt: Type, matchFailGenOverride: Option[Symbol => Tree], unchecked: Boolean): Option[Tree] = - None // todo + def emitSwitch(scrut: Tree, scrutSym: Symbol, cases: List[List[TreeMaker]], pt: Type, matchFailGenOverride: Option[Symbol => Tree], unchecked: Boolean): Option[Tree] = { + // TODO Deal with guards? 
+ + def isSwitchableType(tpe: Type): Boolean = { + (tpe isRef defn.IntClass) || + (tpe isRef defn.ByteClass) || + (tpe isRef defn.ShortClass) || + (tpe isRef defn.CharClass) + } + + object IntEqualityTestTreeMaker { + def unapply(treeMaker: EqualityTestTreeMaker): Option[Int] = treeMaker match { + case EqualityTestTreeMaker(`scrutSym`, _, Literal(const), _) => + if (const.isIntRange) Some(const.intValue) + else None + case _ => + None + } + } + + def isSwitchCase(treeMakers: List[TreeMaker]): Boolean = treeMakers match { + // case 5 => + case List(IntEqualityTestTreeMaker(_), _: BodyTreeMaker) => + true + + // case 5 | 6 => + case List(AlternativesTreeMaker(`scrutSym`, alts, _), _: BodyTreeMaker) => + alts.forall { + case List(IntEqualityTestTreeMaker(_)) => true + case _ => false + } + + // case _ => + case List(_: BodyTreeMaker) => + true + + /* case x @ pat => + * This includes: + * case x => + * case x @ 5 => + * case x @ (5 | 6) => + */ + case (_: SubstOnlyTreeMaker) :: rest => + isSwitchCase(rest) + + case _ => + false + } + + /* (Nil, body) means that `body` is the default case + * It's a bit hacky but it simplifies manipulations. + */ + def extractSwitchCase(treeMakers: List[TreeMaker]): (List[Int], BodyTreeMaker) = treeMakers match { + // case 5 => + case List(IntEqualityTestTreeMaker(intValue), body: BodyTreeMaker) => + (List(intValue), body) + + // case 5 | 6 => + case List(AlternativesTreeMaker(_, alts, _), body: BodyTreeMaker) => + val intValues = alts.map { + case List(IntEqualityTestTreeMaker(intValue)) => intValue + } + (intValues, body) + + // case _ => + case List(body: BodyTreeMaker) => + (Nil, body) + + // case x @ pat => + case (_: SubstOnlyTreeMaker) :: rest => + /* Rebindings have been propagated, so the eventual body in `rest` + * contains all the necessary information. The substitution can be + * dropped at this point. 
+ */ + extractSwitchCase(rest) + } + + def doOverlap(a: List[Int], b: List[Int]): Boolean = + a.exists(b.contains _) + + def makeSwitch(valuesToCases: List[(List[Int], BodyTreeMaker)]): Tree = { + def genBody(body: BodyTreeMaker): Tree = { + val valDefs = body.rebindings.emitValDefs + if (valDefs.isEmpty) body.body + else Block(valDefs, body.body) + } + + val intScrut = + if (pt isRef defn.IntClass) ref(scrutSym) + else Select(ref(scrutSym), nme.toInt) + + val (normalCases, defaultCaseAndRest) = valuesToCases.span(_._1.nonEmpty) + + val newCases = for { + (values, body) <- normalCases + } yield { + val literals = values.map(v => Literal(Constant(v))) + val pat = + if (literals.size == 1) literals.head + else Alternative(literals) + CaseDef(pat, EmptyTree, genBody(body)) + } + + val catchAllDef = { + if (defaultCaseAndRest.isEmpty) { + matchFailGenOverride.fold[Tree]( + Throw(New(defn.MatchErrorType, List(ref(scrutSym)))))( + _(scrutSym)) + } else { + /* After the default case, assuming the IR even allows anything, + * things are unreachable anyway and can be removed. 
+ */ + genBody(defaultCaseAndRest.head._2) + } + } + val defaultCase = CaseDef(Underscore(defn.IntType), EmptyTree, catchAllDef) + + Match(intScrut, newCases :+ defaultCase) + } + + if (isSwitchableType(scrut.tpe.widenDealias) && cases.forall(isSwitchCase)) { + val valuesToCases = cases.map(extractSwitchCase) + val values = valuesToCases.map(_._1) + if (values.tails.exists { tail => tail.nonEmpty && tail.tail.exists(doOverlap(_, tail.head)) }) { + // TODO Deal with overlapping cases (mostly useless without guards) + None + } else { + Some(makeSwitch(valuesToCases)) + } + } else { + None + } + } // for catch (no need to customize match failure) def emitTypeSwitch(bindersAndCases: List[(Symbol, List[TreeMaker])], pt: Type): Option[List[CaseDef]] = diff --git a/src/dotty/tools/dotc/transform/RestoreScopes.scala b/src/dotty/tools/dotc/transform/RestoreScopes.scala index 41da05691..8b9d2be0d 100644 --- a/src/dotty/tools/dotc/transform/RestoreScopes.scala +++ b/src/dotty/tools/dotc/transform/RestoreScopes.scala @@ -11,7 +11,6 @@ import TreeTransforms.MiniPhaseTransform import SymDenotations._ import ast.Trees._ import NameOps._ -import typer.Mode import TreeTransforms.TransformerInfo import StdNames._ diff --git a/src/dotty/tools/dotc/transform/TailRec.scala b/src/dotty/tools/dotc/transform/TailRec.scala index 58fe7a6c9..23686b522 100644 --- a/src/dotty/tools/dotc/transform/TailRec.scala +++ b/src/dotty/tools/dotc/transform/TailRec.scala @@ -1,7 +1,7 @@ package dotty.tools.dotc.transform import dotty.tools.dotc.ast.Trees._ -import dotty.tools.dotc.ast.tpd +import dotty.tools.dotc.ast.{TreeTypeMap, tpd} import dotty.tools.dotc.core.Contexts.Context import dotty.tools.dotc.core.Decorators._ import dotty.tools.dotc.core.DenotTransformers.DenotTransformer @@ -10,13 +10,12 @@ import dotty.tools.dotc.core.Symbols._ import dotty.tools.dotc.core.Types._ import dotty.tools.dotc.core._ import dotty.tools.dotc.transform.TailRec._ -import 
dotty.tools.dotc.transform.TreeTransforms.{TransformerInfo, MiniPhaseTransform} +import dotty.tools.dotc.transform.TreeTransforms.{MiniPhaseTransform, TransformerInfo} /** * A Tail Rec Transformer - * * @author Erik Stenman, Iulian Dragos, - * ported to dotty by Dmitry Petrashko + * ported and heavily modified for dotty by Dmitry Petrashko * @version 1.1 * * What it does: @@ -77,7 +76,9 @@ class TailRec extends MiniPhaseTransform with DenotTransformer with FullParamete private def mkLabel(method: Symbol, abstractOverClass: Boolean)(implicit c: Context): TermSymbol = { val name = c.freshName(labelPrefix) - c.newSymbol(method, name.toTermName, labelFlags, fullyParameterizedType(method.info, method.enclosingClass.asClass, abstractOverClass)) + if (method.owner.isClass) + c.newSymbol(method, name.toTermName, labelFlags, fullyParameterizedType(method.info, method.enclosingClass.asClass, abstractOverClass, liftThisType = false)) + else c.newSymbol(method, name.toTermName, labelFlags, method.info) } override def transformDefDef(tree: tpd.DefDef)(implicit ctx: Context, info: TransformerInfo): tpd.Tree = { @@ -103,7 +104,7 @@ class TailRec extends MiniPhaseTransform with DenotTransformer with FullParamete // and second one will actually apply, // now this speculatively transforms tree and throws away result in many cases val rhsSemiTransformed = { - val transformer = new TailRecElimination(origMeth, owner, thisTpe, mandatory, label, abstractOverClass = defIsTopLevel) + val transformer = new TailRecElimination(origMeth, dd.tparams, owner, thisTpe, mandatory, label, abstractOverClass = defIsTopLevel) val rhs = atGroupEnd(transformer.transform(dd.rhs)(_)) rewrote = transformer.rewrote rhs @@ -111,10 +112,25 @@ class TailRec extends MiniPhaseTransform with DenotTransformer with FullParamete if (rewrote) { val dummyDefDef = cpy.DefDef(tree)(rhs = rhsSemiTransformed) - val res = fullyParameterizedDef(label, dummyDefDef, abstractOverClass = defIsTopLevel) - val call = 
forwarder(label, dd, abstractOverClass = defIsTopLevel) - Block(List(res), call) - } else { + if (tree.symbol.owner.isClass) { + val labelDef = fullyParameterizedDef(label, dummyDefDef, abstractOverClass = defIsTopLevel) + val call = forwarder(label, dd, abstractOverClass = defIsTopLevel, liftThisType = true) + Block(List(labelDef), call) + } else { // inner method. Tail recursion does not change `this` + val labelDef = polyDefDef(label, trefs => vrefss => { + val origMeth = tree.symbol + val origTParams = tree.tparams.map(_.symbol) + val origVParams = tree.vparamss.flatten map (_.symbol) + new TreeTypeMap( + typeMap = identity(_) + .substDealias(origTParams, trefs) + .subst(origVParams, vrefss.flatten.map(_.tpe)), + oldOwners = origMeth :: Nil, + newOwners = label :: Nil + ).transform(rhsSemiTransformed) + }) + Block(List(labelDef), ref(label).appliedToArgss(vparamss0.map(_.map(x=> ref(x.symbol))))) + }} else { if (mandatory) ctx.error("TailRec optimisation not applicable, method not tail recursive", dd.pos) dd.rhs @@ -132,7 +148,7 @@ class TailRec extends MiniPhaseTransform with DenotTransformer with FullParamete } - class TailRecElimination(method: Symbol, enclosingClass: Symbol, thisType: Type, isMandatory: Boolean, label: Symbol, abstractOverClass: Boolean) extends tpd.TreeMap { + class TailRecElimination(method: Symbol, methTparams: List[Tree], enclosingClass: Symbol, thisType: Type, isMandatory: Boolean, label: Symbol, abstractOverClass: Boolean) extends tpd.TreeMap { import dotty.tools.dotc.ast.tpd._ @@ -175,8 +191,9 @@ class TailRec extends MiniPhaseTransform with DenotTransformer with FullParamete case x => (x, x, accArgs, accT, x.symbol) } - val (reciever, call, arguments, typeArguments, symbol) = receiverArgumentsAndSymbol(tree) - val recv = noTailTransform(reciever) + val (prefix, call, arguments, typeArguments, symbol) = receiverArgumentsAndSymbol(tree) + val hasConformingTargs = (typeArguments zip methTparams).forall{x => x._1.tpe <:< x._2.tpe} + val 
recv = noTailTransform(prefix) val targs = typeArguments.map(noTailTransform) val argumentss = arguments.map(noTailTransforms) @@ -215,20 +232,21 @@ class TailRec extends MiniPhaseTransform with DenotTransformer with FullParamete targs ::: classTypeArgs.map(x => ref(x.typeSymbol)) } else targs - val method = Apply(if (callTargs.nonEmpty) TypeApply(Ident(label.termRef), callTargs) else Ident(label.termRef), - List(receiver)) + val method = if (callTargs.nonEmpty) TypeApply(Ident(label.termRef), callTargs) else Ident(label.termRef) + val thisPassed = if(this.method.owner.isClass) method appliedTo(receiver.ensureConforms(method.tpe.widen.firstParamTypes.head)) else method val res = - if (method.tpe.widen.isParameterless) method - else argumentss.foldLeft(method) { - (met, ar) => Apply(met, ar) // Dotty deviation no auto-detupling yet. - } + if (thisPassed.tpe.widen.isParameterless) thisPassed + else argumentss.foldLeft(thisPassed) { + (met, ar) => Apply(met, ar) // Dotty deviation no auto-detupling yet. 
+ } res } if (isRecursiveCall) { if (ctx.tailPos) { - if (recv eq EmptyTree) rewriteTailCall(This(enclosingClass.asClass)) + if (!hasConformingTargs) fail("it changes type arguments on a polymorphic recursive call") + else if (recv eq EmptyTree) rewriteTailCall(This(enclosingClass.asClass)) else if (receiverIsSame || receiverIsThis) rewriteTailCall(recv) else fail("it changes type of 'this' on a polymorphic recursive call") } diff --git a/src/dotty/tools/dotc/transform/TreeChecker.scala b/src/dotty/tools/dotc/transform/TreeChecker.scala index a260963e9..dadaf52e2 100644 --- a/src/dotty/tools/dotc/transform/TreeChecker.scala +++ b/src/dotty/tools/dotc/transform/TreeChecker.scala @@ -15,6 +15,7 @@ import core.StdNames._ import core.Decorators._ import core.TypeErasure.isErasedType import core.Phases.Phase +import core.Mode import typer._ import typer.ErrorReporting._ import reporting.ThrowingReporter diff --git a/src/dotty/tools/dotc/transform/TreeTransform.scala b/src/dotty/tools/dotc/transform/TreeTransform.scala index 7fe003388..67bd2f160 100644 --- a/src/dotty/tools/dotc/transform/TreeTransform.scala +++ b/src/dotty/tools/dotc/transform/TreeTransform.scala @@ -11,7 +11,7 @@ import dotty.tools.dotc.core.Phases.Phase import dotty.tools.dotc.core.SymDenotations.SymDenotation import dotty.tools.dotc.core.Symbols.Symbol import dotty.tools.dotc.core.Flags.PackageVal -import dotty.tools.dotc.typer.Mode +import dotty.tools.dotc.core.Mode import dotty.tools.dotc.ast.Trees._ import dotty.tools.dotc.core.Decorators._ import dotty.tools.dotc.util.DotClass diff --git a/test/dotc/tests.scala b/test/dotc/tests.scala index 457116feb..51b8b3dc5 100644 --- a/test/dotc/tests.scala +++ b/test/dotc/tests.scala @@ -46,12 +46,16 @@ class tests extends CompilerTest { val negDir = testsDir + "neg/" val runDir = testsDir + "run/" val newDir = testsDir + "new/" + val replDir = testsDir + "repl/" val sourceDir = "./src/" val dottyDir = sourceDir + "dotty/" val toolsDir = dottyDir + "tools/" 
+ val backendDir = toolsDir + "backend/" val dotcDir = toolsDir + "dotc/" val coreDir = dotcDir + "core/" + val parsingDir = dotcDir + "parsing/" + val dottyReplDir = dotcDir + "repl/" val typerDir = dotcDir + "typer/" @Test def pickle_pickleOK = compileDir(testsDir, "pickling", testPickling) @@ -109,6 +113,7 @@ class tests extends CompilerTest { @Test def pos_859 = compileFile(posSpecialDir, "i859", scala2mode)(allowDeepSubtypes) @Test def new_all = compileFiles(newDir, twice) + @Test def repl_all = replFiles(replDir) @Test def neg_all = compileFiles(negDir, verbose = true, compileSubDirs = false) @Test def neg_typedIdents() = compileDir(negDir, "typedIdents") @@ -191,11 +196,35 @@ class tests extends CompilerTest { @Test def java_all = compileFiles(javaDir, twice) //@Test def dotc_compilercommand = compileFile(dotcDir + "config/", "CompilerCommand") + //TASTY tests @Test def tasty_new_all = compileFiles(newDir, testPickling) + + @Test def tasty_dotty = compileDir(sourceDir, "dotty", testPickling) + @Test def tasty_annotation_internal = compileDir(s"${dottyDir}annotation/", "internal", testPickling) + @Test def tasty_runtime = compileDir(s"$dottyDir", "runtime", testPickling) + + //TODO: issues with ./src/dotty/runtime/vc/VCPrototype.scala + //@Test def tasty_runtime_vc = compileDir(s"${dottyDir}runtime/", "vc", testPickling) + + @Test def tasty_tools = compileDir(dottyDir, "tools", testPickling) + + //TODO: issue with ./src/dotty/tools/backend/jvm/DottyBackendInterface.scala + @Test def tasty_backend_jvm = compileList("tasty_backend_jvm", List( + "CollectEntryPoints.scala", "GenBCode.scala", "LabelDefs.scala", + "scalaPrimitives.scala" + ) map (s"${backendDir}jvm/" + _), testPickling) + + //TODO: issue with ./src/dotty/tools/backend/sjs/JSCodeGen.scala + @Test def tasty_backend_sjs = compileList("tasty_backend_sjs", List( + "GenSJSIR.scala", "JSDefinitions.scala", "JSEncoding.scala", "JSInterop.scala", + "JSPositions.scala", "JSPrimitives.scala", 
"ScopedVar.scala" + ) map (s"${backendDir}sjs/" + _), testPickling) + + @Test def tasty_dotc = compileDir(toolsDir, "dotc", testPickling) + @Test def tasty_dotc_ast = compileDir(dotcDir, "ast", testPickling) @Test def tasty_dotc_config = compileDir(dotcDir, "config", testPickling) - @Test def tasty_dotc_printing = compileDir(dotcDir, "printing", testPickling) - //@Test def tasty_dotc_reporting = compileDir(dotcDir, "reporting", testPickling) - @Test def tasty_dotc_util = compileDir(dotcDir, "util", testPickling) + + //TODO: issue with ./src/dotty/tools/dotc/core/Types.scala @Test def tasty_core = compileList("tasty_core", List( "Annotations.scala", "Constants.scala", "Constraint.scala", "ConstraintHandling.scala", "ConstraintRunInfo.scala", "Contexts.scala", "Decorators.scala", "Definitions.scala", @@ -206,15 +235,56 @@ class tests extends CompilerTest { "TypeApplications.scala", "TypeComparer.scala", "TypeErasure.scala", "TypeOps.scala", "TyperState.scala", "Uniques.scala" ) map (coreDir + _), testPickling) - @Test def tasty_typer = compileList("tasty_typer", List( - "Applications.scala", "Checking.scala", "ConstFold.scala", "ErrorReporting.scala", - "EtaExpansion.scala", "FrontEnd.scala", "Implicits.scala", "ImportInfo.scala", - "Inferencing.scala", "Mode.scala", "ProtoTypes.scala", "ReTyper.scala", "RefChecks.scala", - "TypeAssigner.scala", "Typer.scala", "VarianceChecker.scala", "Variances.scala" - ) map (typerDir + _), testPickling) - @Test def tasty_tasty = compileDir(coreDir, "tasty", testPickling) + @Test def tasty_classfile = compileDir(coreDir, "classfile", testPickling) + @Test def tasty_tasty = compileDir(coreDir, "tasty", testPickling) @Test def tasty_unpickleScala2 = compileDir(coreDir, "unpickleScala2", testPickling) + + //TODO: issue with ./src/dotty/tools/dotc/parsing/Parsers.scala + @Test def tasty_dotc_parsing = compileList("tasty_dotc_parsing", List( + "CharArrayReader.scala", "JavaParsers.scala", "JavaScanners.scala", "JavaTokens.scala", + 
"MarkupParserCommon.scala", "MarkupParsers.scala", "package.scala" ,"Scanners.scala", + "ScriptParsers.scala", "SymbolicXMLBuilder.scala", "Tokens.scala", "Utility.scala" + ) map (parsingDir + _), testPickling) + + @Test def tasty_dotc_printing = compileDir(dotcDir, "printing", testPickling) + + //TODO: issues with ./src/dotty/tools/dotc/repl/CompilingInterpreter.scala, + //./src/dotty/tools/dotc/repl/InterpreterLoop.scala + @Test def tasty_dotc_repl = compileList("tasty_dotc_repl", List( + "AbstractFileClassLoader.scala", "ConsoleWriter.scala", "InteractiveReader.scala", + "Interpreter.scala", "Main.scala", "NewLinePrintWriter.scala", "REPL.scala", "SimpleReader.scala" + ) map (dottyReplDir + _), testPickling) + + //@Test def tasty_dotc_reporting = compileDir(dotcDir, "reporting", testPickling) + @Test def tasty_dotc_rewrite = compileDir(dotcDir, "rewrite", testPickling) + + //TODO: issues with LazyVals.scala, PatternMatcher.scala + @Test def tasty_dotc_transform = compileList("tasty_dotc_transform", List( + "AugmentScala2Traits.scala", "CapturedVars.scala", "CheckReentrant.scala", "CheckStatic.scala", + "ClassOf.scala", "CollectEntryPoints.scala", "Constructors.scala", "CrossCastAnd.scala", + "CtxLazy.scala", "ElimByName.scala", "ElimErasedValueType.scala", "ElimRepeated.scala", + "ElimStaticThis.scala", "Erasure.scala", "ExpandPrivate.scala", "ExpandSAMs.scala", + "ExplicitOuter.scala", "ExplicitSelf.scala", "ExtensionMethods.scala", "FirstTransform.scala", + "Flatten.scala", "FullParameterization.scala", "FunctionalInterfaces.scala", "GetClass.scala", + "Getters.scala", "InterceptedMethods.scala", "LambdaLift.scala", "LiftTry.scala", "LinkScala2ImplClasses.scala", + "MacroTransform.scala", "Memoize.scala", "Mixin.scala", "MixinOps.scala", "NonLocalReturns.scala", + "NormalizeFlags.scala", "OverridingPairs.scala", "ParamForwarding.scala", "Pickler.scala", "PostTyper.scala", + "ResolveSuper.scala", "RestoreScopes.scala", "SeqLiterals.scala", "Splitter.scala", 
"SuperAccessors.scala", + "SymUtils.scala", "SyntheticMethods.scala", "TailRec.scala", "TreeChecker.scala", "TreeExtractors.scala", + "TreeGen.scala", "TreeTransform.scala", "TypeTestsCasts.scala", "TypeUtils.scala", "ValueClasses.scala", + "VCElideAllocations.scala", "VCInlineMethods.scala" + ) map (s"${dotcDir}transform/" + _), testPickling) + + //TODO: issue with ./src/dotty/tools/dotc/typer/Namer.scala + @Test def tasty_typer = compileList("tasty_typer", List( + "Applications.scala", "Checking.scala", "ConstFold.scala", "ErrorReporting.scala", + "EtaExpansion.scala", "FrontEnd.scala", "Implicits.scala", "ImportInfo.scala", + "Inferencing.scala", "ProtoTypes.scala", "ReTyper.scala", "RefChecks.scala", + "TypeAssigner.scala", "Typer.scala", "VarianceChecker.scala", "Variances.scala" + ) map (typerDir + _), testPickling) + + @Test def tasty_dotc_util = compileDir(dotcDir, "util", testPickling) @Test def tasty_tools_io = compileDir(toolsDir, "io", testPickling) @Test def tasty_tests = compileDir(testsDir, "tasty", testPickling) } diff --git a/test/test/CompilerTest.scala b/test/test/CompilerTest.scala index ef2f719fc..1ca836133 100644 --- a/test/test/CompilerTest.scala +++ b/test/test/CompilerTest.scala @@ -5,12 +5,12 @@ import dotty.tools.dotc.{Main, Bench, Driver} import dotty.tools.dotc.reporting.Reporter import dotty.tools.dotc.util.SourcePosition import dotty.tools.dotc.config.CompilerCommand +import dotty.tools.io.PlainFile import scala.collection.mutable.ListBuffer import scala.reflect.io.{ Path, Directory, File => SFile, AbstractFile } import scala.tools.partest.nest.{ FileManager, NestUI } import scala.annotation.tailrec import java.io.{ RandomAccessFile, File => JFile } -import dotty.tools.io.PlainFile import org.junit.Test @@ -205,7 +205,22 @@ abstract class CompilerTest { } } + def replFile(prefix: String, fileName: String): Unit = { + val path = s"$prefix$fileName" + val f = new PlainFile(path) + val repl = new TestREPL(new String(f.toCharArray)) + 
repl.process(Array[String]()) + repl.check() + } + def replFiles(path: String): Unit = { + val dir = Directory(path) + val fileNames = dir.files.toArray.map(_.jfile.getName).filter(_ endsWith ".check") + for (name <- fileNames) { + log(s"testing $path$name") + replFile(path, name) + } + } // ========== HELPERS ============= diff --git a/test/test/DeSugarTest.scala b/test/test/DeSugarTest.scala index 77aa293d5..1365f3222 100644 --- a/test/test/DeSugarTest.scala +++ b/test/test/DeSugarTest.scala @@ -9,7 +9,7 @@ import dotty.tools.dotc._ import ast.Trees._ import ast.desugar import ast.desugar._ -import typer.Mode +import core.Mode import Contexts.Context import scala.collection.mutable.ListBuffer diff --git a/test/test/TestREPL.scala b/test/test/TestREPL.scala new file mode 100644 index 000000000..d01038c43 --- /dev/null +++ b/test/test/TestREPL.scala @@ -0,0 +1,47 @@ +package test + +import dotty.tools.dotc.repl._ +import dotty.tools.dotc.core.Contexts.Context +import collection.mutable +import java.io.StringWriter + +/** A subclass of REPL used for testing. + * It takes a transcript of a REPL session in `script`. The transcript + * starts with the first input prompt `scala> ` and ends with `scala> :quit` and a newline. + * Invoking `process()` on the `TestREPL` runs all input lines and + * collects them, interleaved with REPL output, in a string writer `out`. + * Invoking `check()` checks that the collected output matches the original + * `script`.
+ */ +class TestREPL(script: String) extends REPL { + + private val out = new StringWriter() + + override lazy val config = new REPL.Config { + override val output = new NewLinePrintWriter(out) + + override def input(implicit ctx: Context) = new InteractiveReader { + val lines = script.lines + def readLine(prompt: String): String = { + val line = lines.next + if (line.startsWith(prompt) || line.startsWith(continuationPrompt)) { + output.println(line) + line.drop(prompt.length) + } + else readLine(prompt) + } + val interactive = false + } + } + + def check() = { + out.close() + val printed = out.toString + val transcript = printed.drop(printed.indexOf(config.prompt)) + if (transcript.toString != script) { + println("input differs from transcript:") + println(transcript) + assert(false) + } + } +}
\ No newline at end of file diff --git a/test/test/showTree.scala b/test/test/showTree.scala index 2c3316ac9..8d5a5ad7c 100644 --- a/test/test/showTree.scala +++ b/test/test/showTree.scala @@ -3,7 +3,7 @@ import dotty.tools.dotc._ import ast.Trees._ import ast.desugar import ast.desugar._ -import typer.Mode +import core.Mode object showTree extends DeSugarTest { diff --git a/tests/neg/tailcall/t6574.scala b/tests/neg/tailcall/t6574.scala index 7030b3b4a..d9ba2882d 100644 --- a/tests/neg/tailcall/t6574.scala +++ b/tests/neg/tailcall/t6574.scala @@ -4,7 +4,7 @@ class Bad[X, Y](val v: Int) extends AnyVal { println("tail") } - @annotation.tailrec final def differentTypeArgs : Unit = { - {(); new Bad[String, Unit](0)}.differentTypeArgs + @annotation.tailrec final def differentTypeArgs : Unit = { // error + {(); new Bad[String, Unit](0)}.differentTypeArgs // error } } diff --git a/tests/neg/variances.scala b/tests/neg/variances.scala index 71ee504bc..d732bb6db 100644 --- a/tests/neg/variances.scala +++ b/tests/neg/variances.scala @@ -41,4 +41,19 @@ object Test2 extends App { } +trait HasY { type Y } + +// These are neg-tests corresponding to the pos-test Variances.scala +// where all the variance annotations have been inverted. +trait Foo1[+X] { def bar[Y <: X](y: Y) = y } // error +trait Foo2[+X] { def bar(x: HasY { type Y <: X })(y: x.Y) = y } // error +trait Foo3[-X] { def bar[Y >: X](y: Y) = y } // error +trait Foo4[-X] { def bar(x: HasY { type Y >: X })(y: x.Y) = y } // error + +// These are neg-tests corresponding to the pos-test Variances.scala +// where all the bounds have been flipped. 
+trait Foo5[-X] { def bar[Y >: X](y: Y) = y } // error +trait Foo6[-X] { def bar(x: HasY { type Y >: X })(y: x.Y) = y } // error +trait Foo7[+X] { def bar[Y <: X](y: Y) = y } // error +trait Foo8[+X] { def bar(x: HasY { type Y <: X })(y: x.Y) = y } // error diff --git a/tests/pos/tailcall/i1089.scala b/tests/pos/tailcall/i1089.scala new file mode 100644 index 000000000..8eb69cb9b --- /dev/null +++ b/tests/pos/tailcall/i1089.scala @@ -0,0 +1,26 @@ +package hello + +import scala.annotation.tailrec + +class Enclosing { + class SomeData(val x: Int) + + def localDef(): Unit = { + def foo(data: SomeData): Int = data.x + + @tailrec + def test(i: Int, data: SomeData): Unit = { + if (i != 0) { + println(foo(data)) + test(i - 1, data) + } + } + + test(3, new SomeData(42)) + } +} + +object world extends App { + println("hello dotty!") + new Enclosing().localDef() +} diff --git a/tests/pos/variances.scala b/tests/pos/variances.scala index db858fd5d..7ab9fe72a 100644 --- a/tests/pos/variances.scala +++ b/tests/pos/variances.scala @@ -1,3 +1,18 @@ trait C[+T <: C[T, U], -U <: C[T, U]] { } +trait HasY { type Y } + +// This works in scalac. +trait Foo1[-X] { def bar[Y <: X](y: Y) = y } + +// A variant of Foo1 using a dependent method type (doesn't work using +// scalac) +trait Foo2[-X] { def bar(x: HasY { type Y <: X })(y: x.Y) = y } + +// This works in scalac. 
+trait Foo3[+X] { def bar[Y >: X](y: Y) = y } + +// A variant of Foo3 using a dependent method type (doesn't work +// using scalac) +trait Foo4[+X] { def bar(x: HasY { type Y >: X })(y: x.Y) = y } diff --git a/tests/repl/import.check b/tests/repl/import.check new file mode 100644 index 000000000..ccaa52190 --- /dev/null +++ b/tests/repl/import.check @@ -0,0 +1,11 @@ +scala> import collection.mutable._ +import collection.mutable._ +scala> val buf = new ListBuffer[Int] +buf: scala.collection.mutable.ListBuffer[Int] = ListBuffer() +scala> buf += 22 +res0: scala.collection.mutable.ListBuffer[Int] = ListBuffer(22) +scala> buf ++= List(1, 2, 3) +res1: scala.collection.mutable.ListBuffer[Int] = ListBuffer(22, 1, 2, 3) +scala> buf.toList +res2: scala.collection.immutable.List[Int] = List(22, 1, 2, 3) +scala> :quit diff --git a/tests/repl/imports.check b/tests/repl/imports.check new file mode 100644 index 000000000..3fa103283 --- /dev/null +++ b/tests/repl/imports.check @@ -0,0 +1,24 @@ +scala> import scala.collection.mutable +import scala.collection.mutable +scala> val buf = mutable.ListBuffer[Int]() +buf: scala.collection.mutable.ListBuffer[Int] = ListBuffer() +scala> object o { + | val xs = List(1, 2, 3) + | } +defined module o +scala> import o._ +import o._ +scala> buf += xs +<console>:11: error: type mismatch: + found : scala.collection.immutable.List[Int](o.xs) + required: String +buf += xs + ^ +<console>:11: error: type mismatch: + found : String + required: scala.collection.mutable.ListBuffer[Int] +buf += xs +^ +scala> buf ++= xs +res1: scala.collection.mutable.ListBuffer[Int] = ListBuffer(1, 2, 3) +scala> :quit diff --git a/tests/repl/multilines.check b/tests/repl/multilines.check new file mode 100644 index 000000000..3bc32707e --- /dev/null +++ b/tests/repl/multilines.check @@ -0,0 +1,33 @@ +scala> val x = """alpha + | + | omega""" +x: String = +alpha + +omega +scala> val y = """abc + | |def + | |ghi + | """.stripMargin +y: String = +abc +def +ghi + +scala> val z 
= { + | def square(x: Int) = x * x + | val xs = List(1, 2, 3) + | square(xs) + | } +<console>:8: error: type mismatch: + found : scala.collection.immutable.List[Int](xs) + required: Int + square(xs) + ^ +scala> val z = { + | def square(x: Int) = x * x + | val xs = List(1, 2, 3) + | xs.map(square) + | } +z: scala.collection.immutable.List[Int] = List(1, 4, 9) +scala> :quit diff --git a/tests/repl/onePlusOne.check b/tests/repl/onePlusOne.check new file mode 100644 index 000000000..9db6e6817 --- /dev/null --- /dev/null +++ b/tests/repl/onePlusOne.check @@ -0,0 +1,3 @@ +scala> 1+1 +res0: Int = 2 +scala> :quit
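The `TestREPL` added in this patch replays a recorded session and compares the rebuilt output against the original transcript (the new `tests/repl/*.check` files). As a rough, self-contained sketch of that idea — with a toy integer-addition evaluator standing in for the real interpreter, and the object name `TranscriptCheck` invented for illustration:

```scala
// Hypothetical, simplified sketch of the transcript-checking scheme used by
// TestREPL: feed the scripted input lines (those starting with the prompt)
// to an evaluator, rebuild the interleaved transcript, and compare it with
// the original. A mismatch signals a regression in REPL output.
object TranscriptCheck {

  /** Replay the inputs of `script` and rebuild the transcript.
   *  The toy evaluator only handles sums like "1+1"; the real TestREPL
   *  delegates to the compiler's interpreter instead.
   */
  def replay(script: String, prompt: String = "scala> "): String = {
    val out = new StringBuilder
    var resCount = 0
    for (line <- script.linesIterator if line.startsWith(prompt)) {
      out.append(line).append('\n')           // echo the input line
      val input = line.drop(prompt.length)
      if (input != ":quit") {                 // :quit produces no output
        val sum = input.split("\\+").map(_.trim.toInt).sum
        out.append(s"res$resCount: Int = $sum").append('\n')
        resCount += 1
      }
    }
    out.toString
  }

  def main(args: Array[String]): Unit = {
    // Mirrors tests/repl/onePlusOne.check from this patch.
    val script = "scala> 1+1\nres0: Int = 2\nscala> :quit\n"
    assert(replay(script) == script, "transcript mismatch")
    println("transcript matches")
  }
}
```

On mismatch, the real `check()` prints the diverging transcript and fails the test via `assert(false)`; the sketch above compresses that into a single assertion.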