author     Felix Mulder <felix.mulder@gmail.com>  2016-10-06 18:11:11 +0200
committer  GitHub <noreply@github.com>  2016-10-06 18:11:11 +0200
commit     10ff9494165210b22eb80e989fc10c3ebf393bae (patch)
tree       ca9fa2f142b8e06d681d65eaf5afa5b44c7d98fc
parent     eaa7f1730aa9da0aa7e4b2c4e86fbcc3acf26131 (diff)
parent     237ddc31ab0281f9f2cddf598fc5f9af50f91f06 (diff)
download   dotty-10ff9494165210b22eb80e989fc10c3ebf393bae.tar.gz
           dotty-10ff9494165210b22eb80e989fc10c3ebf393bae.tar.bz2
           dotty-10ff9494165210b22eb80e989fc10c3ebf393bae.zip
Merge pull request #1555 from felixmulder/topic/docs
Migrate wiki to docs dir
-rw-r--r--  docs/.gitignore  6
-rw-r--r--  docs/Gemfile  27
-rw-r--r--  docs/HigherKinded-v2.md  375
-rw-r--r--  docs/_config.yml  6
-rw-r--r--  docs/_includes/scala-logo.html  19
-rw-r--r--  docs/_includes/toc.html  37
-rw-r--r--  docs/_layouts/blog.html  18
-rw-r--r--  docs/_layouts/default.html  23
-rw-r--r--  docs/_plugins/JekyllMarkdownLinkConverter/LICENSE  202
-rw-r--r--  docs/_plugins/JekyllMarkdownLinkConverter/converter.rb  66
-rw-r--r--  docs/blog/_posts/2015-10-23-dotty-compiler-bootstraps.md (renamed from docs/2015-10-23-dotty-compiler-bootstraps.md)  27
-rw-r--r--  docs/blog/_posts/2016-01-02-new-year-resolutions.md  65
-rw-r--r--  docs/blog/_posts/2016-02-03-essence-of-scala.md  145
-rw-r--r--  docs/blog/_posts/2016-02-17-scaling-dot-soundness.md  158
-rw-r--r--  docs/blog/_posts/2016-05-05-multiversal-equality.md  89
-rw-r--r--  docs/blog/index.html  22
-rw-r--r--  docs/contributing/eclipse.md  50
-rw-r--r--  docs/contributing/getting-started.md  42
-rw-r--r--  docs/contributing/intellij-idea.md  36
-rw-r--r--  docs/contributing/workflow.md  81
-rw-r--r--  docs/css/main.scss  238
-rw-r--r--  docs/dotc-internals/overall-structure.md  174
-rw-r--r--  docs/dotc-internals/periods.md  94
-rw-r--r--  docs/images/favicon.png  bin  0 -> 3304 bytes
-rw-r--r--  docs/images/felix.jpeg  bin  0 -> 49140 bytes
-rw-r--r--  docs/images/fengyun.png  bin  0 -> 277113 bytes
-rw-r--r--  docs/images/martin.jpg  bin  0 -> 20589 bytes
-rw-r--r--  docs/images/nico.png  bin  0 -> 579556 bytes
-rw-r--r--  docs/images/petrashko.png  bin  0 -> 92799 bytes
-rw-r--r--  docs/images/smarter.jpg  bin  0 -> 131279 bytes
-rw-r--r--  docs/index.md  34
-rw-r--r--  docs/internals/backend.md  127
-rw-r--r--  docs/internals/benchmarks.md  5
-rw-r--r--  docs/internals/classpaths.md  42
-rw-r--r--  docs/internals/contexts.md  55
-rw-r--r--  docs/internals/dotc-scalac.md  104
-rw-r--r--  docs/internals/higher-kinded-v2.md  461
-rw-r--r--  docs/internals/overall-structure.md  191
-rw-r--r--  docs/internals/periods.md  96
-rw-r--r--  docs/internals/type-system.md  134
-rw-r--r--  docs/js/highlight.pack.js  2
-rw-r--r--  docs/syntax-summary.txt (renamed from docs/SyntaxSummary.txt)  0
-rw-r--r--  docs/usage/migrating.md  46
-rw-r--r--  docs/usage/sbt-projects.md  14
44 files changed, 2657 insertions, 654 deletions
diff --git a/docs/.gitignore b/docs/.gitignore
new file mode 100644
index 000000000..6f8e43ce5
--- /dev/null
+++ b/docs/.gitignore
@@ -0,0 +1,6 @@
+# Jekyll specific ignores
+vendor/
+.bundle/
+Gemfile.lock
+_site/
+.sass-cache/
diff --git a/docs/Gemfile b/docs/Gemfile
new file mode 100644
index 000000000..4f3c7d08f
--- /dev/null
+++ b/docs/Gemfile
@@ -0,0 +1,27 @@
+source "https://rubygems.org"
+ruby RUBY_VERSION
+
+# Hello! This is where you manage which Jekyll version is used to run.
+# When you want to use a different version, change it below, save the
+# file and run `bundle install`. Run Jekyll with `bundle exec`, like so:
+#
+# bundle exec jekyll serve
+#
+# This will help ensure the proper Jekyll version is running.
+# Happy Jekylling!
+gem "jekyll", "3.2.1"
+
+# This is the default theme for new Jekyll sites. You may change this to anything you like.
+gem "minima"
+
+# Table of contents
+gem 'jekyll-toc'
+
+# If you want to use GitHub Pages, remove the "gem "jekyll"" above and
+# uncomment the line below. To upgrade, run `bundle update github-pages`.
+# gem "github-pages", group: :jekyll_plugins
+
+# If you have any plugins, put them here!
+# group :jekyll_plugins do
+# gem "jekyll-github-metadata", "~> 1.0"
+# end
diff --git a/docs/HigherKinded-v2.md b/docs/HigherKinded-v2.md
deleted file mode 100644
index 2ca6424de..000000000
--- a/docs/HigherKinded-v2.md
+++ /dev/null
@@ -1,375 +0,0 @@
-Higher-Kinded Types in Dotty V2
-===============================
-
-This note outlines how we intend to represent higher-kinded types in
-Dotty. The principal idea is to collapse the four previously
-disparate features of refinements, type parameters, existentials and
-higher-kinded types into just one: refinements of type members. All
-other features will be encoded using these refinements.
-
-The complexity of type systems tends to grow exponentially with the
-number of independent features, because there are an exponential
-number of possible feature interactions. Consequently, a reduction
-from 4 to 1 fundamental features achieves a dramatic reduction of
-complexity. It also adds some nice usability improvements, notably in
-the area of partial type application.
-
-This is a second version of the scheme which differs in a key aspect
-from the first one: Following Adriaan's idea, we use traits with type
-members to model type lambdas and type applications. This is both more
-general and more robust than the intersections with type constructor
-traits that we had in the first version.
-
-The duality
------------
-
-The core idea: A parameterized class such as
-
- class Map[K, V]
-
-is treated as equivalent to a type with type members:
-
- class Map { type Map$K; type Map$V }
-
-The type members are name-mangled (i.e. `Map$K`) so that they do not conflict with other
-members or parameters named `K` or `V`.
-
-A type-instance such as `Map[String, Int]` would then be treated as equivalent to
-
- Map { type Map$K = String; type Map$V = Int }
-
-Named type parameters
----------------------
-
-Type parameters can have unmangled names. This is achieved by adding the `type` keyword
-to a type parameter declaration, analogous to how `val` indicates a named field. For instance,
-
- class Map[type K, type V]
-
-is treated as equivalent to
-
- class Map { type K; type V }
-
-The parameters are made visible as fields.
-
-Wildcards
----------
-
-A wildcard type such as `Map[_, Int]` is equivalent to
-
- Map { type Map$V = Int }
-
-That is, a `_` leaves the corresponding parameter uninstantiated. Wildcard arguments
-can have bounds. E.g.
-
- Map[_ <: AnyRef, Int]
-
-is equivalent to
-
- Map { type Map$K <: AnyRef; type Map$V = Int }
-
-
-Type parameters in the encodings
---------------------------------
-
-The notion of type parameters makes sense even for encoded types,
-which do not contain parameter lists in their syntax. Specifically,
-the type parameters of a type are a sequence of type fields that
-correspond to parameters in the unencoded type. They are determined as
-follows.
-
- - The type parameters of a class or trait type are those parameter fields declared in the class
- that are not yet instantiated, in the order they are given. Type parameter fields of parents
- are not considered.
- - The type parameters of an abstract type are the type parameters of its upper bound.
- - The type parameters of an alias type are the type parameters of its right hand side.
- - The type parameters of every other type are the empty sequence.
-
-Partial applications
---------------------
-
-The definition of type parameters in the previous section leads to a simple model of partial applications.
-Consider for instance:
-
- type Histogram = Map[_, Int]
-
-`Histogram` is a higher-kinded type that still has one type parameter.
-`Histogram[String]`
-would be a possible type instance, and it would be equivalent to `Map[String, Int]`.
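In current Scala terms, the same partial application can be written as an explicitly parameterized alias. The following sketch (the `HistogramDemo` object and its members are hypothetical, for illustration only) shows that `Histogram[String]` and `Map[String, Int]` are interchangeable:

```scala
// Hypothetical sketch in plain Scala: `Histogram` as a one-parameter
// alias over `Map`, mirroring the proposed `type Histogram = Map[_, Int]`.
object HistogramDemo {
  type Histogram[K] = Map[K, Int]

  // A Histogram[String] is just a Map[String, Int].
  val wordCounts: Histogram[String] = Map("scala" -> 1, "dotty" -> 2)
  val asMap: Map[String, Int] = wordCounts // compiles: the types are equal
}
```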
-
-
-Modelling polymorphic type declarations
----------------------------------------
-
-The partial application scheme gives us a new -- and quite elegant --
-way to do certain higher-kinded types. But how do we interpret the
-polymorphic types that exist in current Scala?
-
-More concretely, current Scala allows us to write parameterized type
-definitions, abstract types, and type parameters. In the new scheme,
-only classes (and traits) can have parameters and these are treated as
-equivalent to type members. Type aliases and abstract types do not
-allow the definition of parameterized types so we have to interpret
-polymorphic type aliases and abstract types specially.
-
-Modelling polymorphic type aliases: simple case
------------------------------------------------
-
-A polymorphic type alias such as
-
- type Pair[T] = Tuple2[T, T]
-
-where `Tuple2` is declared as
-
- class Tuple2[T1, T2] ...
-
-is expanded to a monomorphic type alias like this:
-
- type Pair = Tuple2 { type Tuple2$T2 = Tuple2$T1 }
-
-More generally, each type parameter of the left-hand side must
-appear as a type member of the right hand side type. Type members
-must appear in the same order as their corresponding type parameters.
-References to the type parameter are then translated to references to the
-type member. The type member itself is left uninstantiated.
-
-This technique can expand most polymorphic type aliases appearing
-in Scala codebases but not all of them. For instance, the following
-alias cannot be expanded, because the parameter type `T` is not a
-type member of the right-hand side `List[List[T]]`.
-
- type List2[T] = List[List[T]]
-
-We scanned the Scala standard library for occurrences of polymorphic
-type aliases and determined that only two occurrences could not be expanded.
-In `io/Codec.scala`:
-
- type Configure[T] = (T => T, Boolean)
-
-And in `collection/immutable/HashMap.scala`:
-
- private type MergeFunction[A1, B1] = ((A1, B1), (A1, B1)) => (A1, B1)
-
-For these cases, we use a fall-back scheme that models a parameterized alias as a
-`Lambda` type.
-
-Modelling polymorphic type aliases: general case
-------------------------------------------------
-
-A polymorphic type alias such as
-
- type List2D[T] = List[List[T]]
-
-is represented as a monomorphic type alias of a type lambda. Here's the expanded version of
-the definition above:
-
- type List2D = Lambda$I { type Apply = List[List[$hkArg$0]] }
-
-Here, `Lambda$I` is a standard trait defined as follows:
-
- trait Lambda$I[type $hkArg$0] { type +Apply }
-
-The `I` suffix of the `Lambda` trait indicates that it has one invariant type parameter (named `$hkArg$0`).
-Other suffixes are `P` for covariant type parameters, and `N` for contravariant type parameters. Lambda traits can
-have more than one type parameter. For instance, here is a trait with contravariant and covariant type parameters:
-
- trait Lambda$NP[type -$hkArg$0, +$hkArg$1] { type +Apply } extends Lambda$IP with Lambda$NI
-
-Aside: the `+` prefix in front of `Apply` indicates that `Apply` is a covariant type field. Dotty
-admits variance annotations on type members.
-
-The definition of `Lambda$NP` shows that `Lambda` traits form a subtyping hierarchy: Traits which
-have covariant or contravariant type parameters are subtypes of traits which don't. The supertraits
-of `Lambda$NP` would themselves be written as follows.
-
- trait Lambda$IP[type $hkArg$0, +$hkArg$1] { type +Apply } extends Lambda$II
- trait Lambda$NI[type -$hkArg$0, $hkArg$1] { type +Apply } extends Lambda$II
- trait Lambda$II[type $hkArg$0, $hkArg1] { type +Apply }
-
-`Lambda` traits are special in that
-they influence how type applications are expanded: If the standard type application `T[X1, ..., Xn]`
-leads to a subtype `S` of a type instance
-
- LambdaXYZ { type Arg1 = T1; ...; type ArgN = Tn; type Apply ... }
-
-where all argument fields `Arg1, ..., ArgN` are concretely defined
-and the definition of the `Apply` field may be either abstract or concrete, then the application
-is further expanded to `S # Apply`.
-
-For instance, the type instance `List2D[String]` would be expanded to
-
- Lambda$I { type $hkArg$0 = String; type Apply = List[List[String]] } # Apply
-
-which in turn simplifies to `List[List[String]]`.
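As a sanity check, the unexpanded alias already behaves this way in plain Scala; the demo object below is hypothetical:

```scala
// Hypothetical sketch: `List2D` as an ordinary parameterized alias.
// `List2D[String]` is the same type as `List[List[String]]`.
object List2DDemo {
  type List2D[T] = List[List[T]]

  val grid: List2D[String] = List(List("a", "b"), List("c"))
  val plain: List[List[String]] = grid // compiles: the types are identical
}
```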
-
-2nd Example: Consider the two aliases
-
- type RMap[K, V] = Map[V, K]
- type RRMap[K, V] = RMap[V, K]
-
-These expand as follows:
-
- type RMap = Lambda$II { self1 => type Apply = Map[self1.$hkArg$1, self1.$hkArg$0] }
- type RRMap = Lambda$II { self2 => type Apply = RMap[self2.$hkArg$1, self2.$hkArg$0] }
-
-Substituting the definition of `RMap` and expanding the type application gives:
-
- type RRMap = Lambda$II { self2 => type Apply =
- Lambda$II { self1 => type Apply = Map[self1.$hkArg$1, self1.$hkArg$0] }
- { type $hkArg$0 = self2.$hkArg$1; type $hkArg$1 = self2.$hkArg$0 } # Apply }
-
-Substituting the definitions for `self1.$hkArg${0,1}` gives:
-
- type RRMap = Lambda$II { self2 => type Apply =
- Lambda$II { self1 => type Apply = Map[self2.$hkArg$0, self2.$hkArg$1] }
- { type $hkArg$0 = self2.$hkArg$1; type $hkArg$1 = self2.$hkArg$0 } # Apply }
-
-Simplifying the `# Apply` selection gives:
-
- type RRMap = Lambda$II { self2 => type Apply = Map[self2.$hkArg$0, self2.$hkArg$1] }
-
-This can be regarded as the eta-expanded version of `Map`. It has the same expansion as
-
- type IMap[K, V] = Map[K, V]
-
-
-Modelling higher-kinded types
------------------------------
-
-The encoding of higher-kinded types again uses the `Lambda` traits to represent type constructors.
-Consider the higher-kinded type declaration
-
- type Rep[T]
-
-We expand this to
-
- type Rep <: Lambda$I
-
-The type parameters of `Rep` are the type parameters of its upper bound, so
-`Rep` is a unary type constructor.
-
-More generally, a higher-kinded type declaration
-
- type T[v1 X1 >: S1 <: U1, ..., vn XN >: SN <: UN] >: SR <: UR
-
-is encoded as
-
- type T <: LambdaV1...Vn { self =>
- type v1 $hkArg$0 >: s(S1) <: s(U1)
- ...
- type vn $hkArg$N >: s(SN) <: s(UN)
- type Apply >: s(SR) <: s(UR)
- }
-
-where `s` is the substitution `[XI := self.$hkArg$I | I = 1,...,N]`.
-
-If we instantiate `Rep` with a type argument, this is expanded as was explained before.
-
- Rep[String]
-
-would expand to
-
- Rep { type $hkArg$0 = String } # Apply
-
-If we instantiate the higher-kinded type with a concrete type constructor (i.e. a parameterized
-trait or class), we have to do one extra adaptation to make it work. The parameterized trait
-or class has to be eta-expanded so that it conforms to the `Lambda` bound. For instance,
-
- type Rep = Set
-
-would expand to
-
- type Rep = Lambda1 { type Apply = Set[$hkArg$0] }
-
-Or,
-
- type Rep = Map[String, _]
-
-would expand to
-
- type Rep = Lambda1 { type Apply = Map[String, $hkArg$0] }
-
-
-Full example
-------------
-
-Consider the higher-kinded `Functor` type class
-
- class Functor[F[_]] {
- def map[A, B](f: A => B): F[A] => F[B]
- }
-
-This would be represented as follows:
-
- class Functor[F <: Lambda1] {
- def map[A, B](f: A => B): F { type $hkArg$0 = A } # Apply => F { type $hkArg$0 = B } # Apply
- }
-
-The type `Functor[List]` would be represented as follows
-
- Functor {
- type F = Lambda1 { type Apply = List[$hkArg$0] }
- }
-
-Now, assume we have a value
-
- val ml: Functor[List]
-
-Then `ml.map` would have type
-
- s(F { type $hkArg$0 = A } # Apply => F { type $hkArg$0 = B } # Apply)
-
-where `s` is the substitution of `[F := Lambda1 { type Apply = List[$hkArg$0] }]`.
-This gives:
-
- Lambda1 { type Apply = List[$hkArg$0] } { type $hkArg$0 = A } # Apply
- => Lambda1 { type Apply = List[$hkArg$0] } { type $hkArg$0 = B } # Apply
-
-This type simplifies to:
-
- List[A] => List[B]
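For comparison, here is the unencoded `Functor` example running in current Scala, where the compiler performs the analogous simplification internally (the `FunctorDemo` object and its `listFunctor` instance are hypothetical):

```scala
// Hypothetical sketch of the Functor example in plain Scala.
object FunctorDemo {
  trait Functor[F[_]] {
    def map[A, B](f: A => B): F[A] => F[B]
  }

  // An instance for List: here `map` has the simplified type
  // (A => B) => (List[A] => List[B]).
  val listFunctor: Functor[List] = new Functor[List] {
    def map[A, B](f: A => B): List[A] => List[B] = _.map(f)
  }

  val doubled: List[Int] = listFunctor.map((x: Int) => x * 2)(List(1, 2, 3))
}
```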
-
-Status of #
------------
-
-In the scheme above we have silently assumed that `#` "does the right
-thing", i.e. that the types are well-formed and we can collapse a type
-alias with a `#` projection, thereby giving us a form of beta
-reduction.
-
-In Scala 2.x, this would not work, because `T#X` means `x.X forSome { val x: T }`.
-Hence, two occurrences of, say, `Rep[Int]` would not be recognized as equal, because the
-existential would be opened afresh each time.
-
-In pre-existential Scala, this would not have worked either. There, `T#X` was a fundamental
-type constructor, but was restricted to alias types or classes for both `T` and `X`.
-Roughly, `#` was meant to encode Java's inner classes. In Java, given the classes
-
- class Outer { class Inner }
- class Sub1 extends Outer
- class Sub2 extends Outer
-
-The types `Outer#Inner`, `Sub1#Inner` and `Sub2#Inner` would all exist and be
-regarded as equal to each other. But if `Outer` had abstract type members this would
-not work, since an abstract type member could be instantiated differently in `Sub1` and `Sub2`.
-Assuming that `Sub1#Inner = Sub2#Inner` could then lead to a soundness hole. To avoid soundness
-problems, the types in `X#Y` were restricted so that `Y` was (an alias of) a class type and
-`X` was (an alias of) a class type with no abstract type members.
-
-I believe we can go back to regarding `T#X` as a fundamental type constructor, the way it
-was done in pre-existential Scala, but with the following relaxed restriction:
-
- _In a type selection `T#X`, `T` is not allowed to have any abstract members different from `X`._
-
-This would typecheck the higher-kinded type examples, because they only project with `# Apply` once all
-`$hkArg$` type members are fully instantiated.
-
-It would be good to study this rule formally, trying to verify its soundness.
-
-
-
-
-
-
-
-
diff --git a/docs/_config.yml b/docs/_config.yml
new file mode 100644
index 000000000..1460cbaac
--- /dev/null
+++ b/docs/_config.yml
@@ -0,0 +1,6 @@
+title: Dotty Documentation
+baseurl: "/"
+markdown: JekyllMarkdownLinkConverter
+theme: minima
+gems:
+ - jekyll-toc
diff --git a/docs/_includes/scala-logo.html b/docs/_includes/scala-logo.html
new file mode 100644
index 000000000..bfabf88f7
--- /dev/null
+++ b/docs/_includes/scala-logo.html
@@ -0,0 +1,19 @@
+<svg width="64" height="110" xmlns="http://www.w3.org/2000/svg">
+ <g>
+ <title>background</title>
+ <rect x="-1" y="-1" width="9.664668" height="15.173648" id="canvas_background" fill="none"/>
+ </g>
+
+ <g>
+ <title>Layer 1</title>
+ <g id="logo">
+ <g opacity="0.61" id="svg_1">
+ <path fill="#FFFFFF" d="m0.5,41.9c0,0 62.9,6.3 62.9,16.8l0,-25.2c0,0 0,-10.5 -62.9,-16.8l0,9.8l0,15.4z" id="svg_2"/>
+ <path fill="#FFFFFF" d="m0.5,75.5c0,0 62.9,6.3 62.9,16.8l0,-25.2c0,0 0,-10.5 -62.9,-16.8l0,25.2z" id="svg_3"/>
+ </g>
+ <path fill="#FFFFFF" d="m63.5,0l0,25.2c0,0 0,10.5 -62.9,16.8l0,-25.3c-0.1,0 62.9,-6.2 62.9,-16.7" id="svg_4"/>
+ <path fill="#FFFFFF" d="m0.5,50.3c0,0 62.9,-6.3 62.9,-16.8l0,25.2c0,0 0,10.5 -62.9,16.8l0,-25.2z" id="svg_5"/>
+ <path fill="#FFFFFF" d="m0.5,109l0,-25.2c0,0 62.9,-6.3 62.9,-16.8l0,25.2c0.1,0.1 0.1,10.5 -62.9,16.8" id="svg_6"/>
+ </g>
+ </g>
+</svg>
diff --git a/docs/_includes/toc.html b/docs/_includes/toc.html
new file mode 100644
index 000000000..213926856
--- /dev/null
+++ b/docs/_includes/toc.html
@@ -0,0 +1,37 @@
+<div>
+ <div id="scala-logo">
+ {% include scala-logo.html %}
+ </div>
+ <ul id="categories">
+ <li><ul><li><a href="/blog">Blog</a></li></ul></li>
+ <li><ul><li><a href="/">Dotty Docs</a></li></ul></li>
+ <li>
+ Usage
+ <ul>
+ <li><a href="/usage/migrating.html">Migrating from Scala 2</a></li>
+ <li><a href="/usage/sbt-projects.html">Using Dotty with sbt</a></li>
+ </ul>
+ </li>
+ <li>
+ Contributing
+ <ul>
+ <li><a href="/contributing/getting-started.html">Getting Started</a></li>
+ <li><a href="/contributing/workflow.html">Workflow</a></li>
+ <li><a href="/contributing/eclipse.html">Eclipse</a></li>
+ <li><a href="/contributing/intellij-idea.html">Intellij-IDEA</a></li>
+ </ul>
+ </li>
+ <li>
+ Internals
+ <ul>
+ <li><a href="/internals/backend.html">Backend</a></li>
+ <li><a href="/internals/contexts.html">Contexts</a></li>
+ <li><a href="/internals/higher-kinded-v2.html">Higher Kinded Type Scheme</a></li>
+ <li><a href="/internals/overall-structure.html">Project Structure</a></li>
+ <li><a href="/internals/periods.html">Periods</a></li>
+ <li><a href="/internals/type-system.html">Type System</a></li>
+ <li><a href="/internals/dotc-scalac.html">Dotty vs Scala2</a></li>
+ </ul>
+ </li>
+ </ul>
+</div>
diff --git a/docs/_layouts/blog.html b/docs/_layouts/blog.html
new file mode 100644
index 000000000..fde08e14e
--- /dev/null
+++ b/docs/_layouts/blog.html
@@ -0,0 +1,18 @@
+---
+layout: default
+---
+
+<h1 class="title">{{ page.title }}</h1>
+<h2 class="subtitle">{{ page.subTitle }}</h2>
+
+<div class="author-container {% if page.authorImg != null %} spaced {% endif %}">
+ {% if page.authorImg != null %}
+ <img src="{{ page.authorImg }}"/>
+ {% endif %}
+ <div class="author-info">
+ <div>{{ page.author }}</div>
+ <div>{{ page.date | date: '%B %d, %Y' }}</div>
+ </div>
+</div>
+
+{{ content }}
diff --git a/docs/_layouts/default.html b/docs/_layouts/default.html
new file mode 100644
index 000000000..19cd7ce59
--- /dev/null
+++ b/docs/_layouts/default.html
@@ -0,0 +1,23 @@
+<html>
+ <head>
+ <title>Dotty - {{ page.title }}</title>
+ <link rel="shortcut icon" type="image/png" href="/images/favicon.png"/>
+ <link rel="stylesheet" href="http://cdnjs.cloudflare.com/ajax/libs/highlight.js/9.7.0/styles/github.min.css">
+ <link rel="stylesheet" href="/css/main.css">
+ </head>
+ <body>
+ <div id="container">
+ <div id="scala-logo-mobile">
+ {% include scala-logo.html %}
+ </div>
+ <div id="content">
+ {{ content }}
+ </div>
+ <div id="toc">
+ {% include toc.html %}
+ </div>
+ </div>
+ </body>
+ <script src="/js/highlight.pack.js"></script>
+ <script>hljs.initHighlightingOnLoad();</script>
+</html>
diff --git a/docs/_plugins/JekyllMarkdownLinkConverter/LICENSE b/docs/_plugins/JekyllMarkdownLinkConverter/LICENSE
new file mode 100644
index 000000000..8f71f43fe
--- /dev/null
+++ b/docs/_plugins/JekyllMarkdownLinkConverter/LICENSE
@@ -0,0 +1,202 @@
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
+
+ APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "{}"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright {yyyy} {name of copyright owner}
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+
diff --git a/docs/_plugins/JekyllMarkdownLinkConverter/converter.rb b/docs/_plugins/JekyllMarkdownLinkConverter/converter.rb
new file mode 100644
index 000000000..95b8f1fb4
--- /dev/null
+++ b/docs/_plugins/JekyllMarkdownLinkConverter/converter.rb
@@ -0,0 +1,66 @@
+# We need to be placed in Kramdown::Converter.
+# When calling a method on a Kramdown document of the form "to_****",
+# Kramdown will shave off the "****" portion, Pascal-case it, and
+# look for a class of that name in Kramdown::Converter.
+# See https://github.com/gettalong/kramdown/blob/05d467bfb9abb732046e441ef1958471195d665d/lib/kramdown/document.rb#L113-L116
+module Kramdown
+ module Converter
+ # Fixes relative markdown links to point to their dash-separated, lowercased html outputs
+ class MarkdownLinkAmendedHtml < Html
+ def convert_a(el, indent)
+ href = el.attr['href']
+ # Ensure that the link is relative to the site (doesn't start with "protocol://")
+ # and that it links to a markdown file.
+ if not /^\w+?:\/\// =~ href and href.end_with?('.md')
+ # Duplicate the attributes to avoid modifying the tree.
+ attr = el.attr.dup
+
+ # Remove the 'md', replace whitespace with dashes, switch the extension to html
+ dir, md_base = File.split(href)
+ html_base = md_base.chomp('.md').gsub(/\s+|\.|'/, '-').downcase + '.html'
+ attr['href'] = File.join(dir, html_base)
+
+ self.format_as_span_html(el.type, attr, self.inner(el, indent))
+ else
+ super(el, indent)
+ end
+ end
+ end
+ end
+end
+
+# One requirement for Jekyll not to freak out
+# is that we need to be in the Jekyll::Converters::Markdown
+# module to be a valid markdown converter.
+module Jekyll
+ class Converters::Markdown::JekyllMarkdownLinkConverter < Converter
+ safe true
+
+ # Match markdown files.
+ def matches(ext)
+ ext =~ /^\.md$/i
+ end
+
+ # Output html files.
+ def output_ext(ext)
+ '.html'
+ end
+
+ def convert(content)
+ kramdown_config = symbolize_keys(@config['kramdown'])
+ doc = Kramdown::Document.new(content, kramdown_config)
+ html = doc.to_markdown_link_amended_html
+ html
+ end
+ end
+end
+
+def symbolize_keys(input)
+ result = {}
+
+ input.each do |k,v|
+ result[k.intern] = v
+ end
+
+ result
+end \ No newline at end of file
diff --git a/docs/2015-10-23-dotty-compiler-bootstraps.md b/docs/blog/_posts/2015-10-23-dotty-compiler-bootstraps.md
index cdd472b7f..b6ee44020 100644
--- a/docs/2015-10-23-dotty-compiler-bootstraps.md
+++ b/docs/blog/_posts/2015-10-23-dotty-compiler-bootstraps.md
@@ -1,12 +1,10 @@
---
layout: blog
-post-type: blog
-by: Martin Odersky and Dmitry Petrashko
-title: We Got LiftOff! The Dotty Compiler for Scala Bootstraps.
+author: Martin Odersky and Dmitry Petrashko
+title: "We got liftoff!"
+subTitle: The Dotty compiler for Scala bootstraps.
---
-## We Got Liftoff!
-
The [Dotty project](https://github.com/lampepfl/dotty)
is a platform to develop new technology for Scala
tooling and to try out concepts of future Scala language versions.
@@ -14,7 +12,7 @@ Its compiler is a new design intended to reflect the
lessons we learned from work with the Scala compiler. A clean redesign
today will let us iterate faster with new ideas in the future.
-Today we reached an important milestone: The Dotty compiler can
+Today we reached an important milestone: the Dotty compiler can
compile itself, and the compiled compiler can act as a drop-in replacement for the
original one. This is what one calls a *bootstrap*.
@@ -35,7 +33,7 @@ go unnoticed, precisely because every part of a compiler feeds into
other parts and all together are necessary to produce a correct
translation.
-## Are We Done Yet?
+## Are we done yet?
Far from it! The compiler is still very rough. A lot more work is
needed to
@@ -43,21 +41,28 @@ needed to
- make it more robust, in particular when analyzing incorrect programs,
- improve error messages and warnings,
- improve the efficiency of some of the generated code,
+ - improve compilation speed,
- embed it in external tools such as sbt, REPL, IDEs,
- remove restrictions on what Scala code can be compiled,
- help in migrating Scala code that will have to be changed.
-## What Are the Next Steps?
+## What are the next steps?
Over the coming weeks and months, we plan to work on the following topics:
- Make snapshot releases.
- - Get the Scala standard library to compile.
- Work on SBT integration of the compiler.
- Work on IDE support.
 - Investigate the best way to obtain a REPL.
- Work on the build infrastructure.
-If you want to get your hands dirty with any of this, now is a good moment to get involved!
-To get started: <https://github.com/lampepfl/dotty>.
+If you want to get your hands dirty with any of this, now is a good
+moment to get involved! Join the team of contributors, including
+Dmitry Petrashko ([@DarkDimius](https://github.com/DarkDimius)),
+Guillaume Martres ([@smarter](https://github.com/smarter)),
+Ondřej Lhoták ([@olhotak](https://github.com/olhotak)),
+Samuel Gruetter ([@samuelgruetter](https://github.com/samuelgruetter)),
+Vera Salvis ([@vsalvis](https://github.com/vsalvis)),
+and Jason Zaugg ([@retronym](https://github.com/retronym)).
+To get started: <https://github.com/lampepfl/dotty>.
diff --git a/docs/blog/_posts/2016-01-02-new-year-resolutions.md b/docs/blog/_posts/2016-01-02-new-year-resolutions.md
new file mode 100644
index 000000000..a4ce3a54e
--- /dev/null
+++ b/docs/blog/_posts/2016-01-02-new-year-resolutions.md
@@ -0,0 +1,65 @@
+---
+layout: blog
+title: New Year Resolutions
+author: Martin Odersky
+authorImg: /images/martin.jpg
+---
+
+For most of us, the change of the year is an occasion for thinking
+about what we missed doing last year and where we want to improve. I decided
+there are a couple of things where I would like to do better in 2016
+than in 2015. The first is that I would like to do more blogging and
+writing in general. I have been pretty silent for most of the last
+year. This was mostly caused by the fact that I had my head down
+working on DOT, Scala's foundations, and _dotty_, the new Scala compiler
+platform we are working on. It's been a lot of work, but we are finally
+getting good results. DOT now has a mechanized proof of type soundness
+and the dotty compiler [can now compile
+itself](http://www.scala-lang.org/blog/2015/10/23/dotty-compiler-bootstraps.html)
+as well as large parts of Scala's standard library.
+
+The dotty compiler has a completely new and quite unusual
+architecture, which makes it resemble a functional database or a
+functional reactive program. My [talk at the JVM language
+summit](https://www.youtube.com/watch?v=WxyyJyB_Ssc) gives an
+overview. In the coming months I want to write, together with my
+collaborators, a series of blog posts that explain details of the
+code base. The
+aim of these posts will be to present the new architectural patterns
+to a larger audience and also to help existing and potential
+contributors get familiar with the code base.
+
+My second resolution is to take a larger effort to promote simplicity
+in Scala. I believe the recent [blog post by Jim
+Plush](http://jimplush.com/talk/2015/12/19/moving-a-team-from-scala-to-golang/) should be a wakeup call for our
+community. Scala is a very powerful and un-opinionated language. This
+means we have a large spectrum of choices for how to write a Scala
+application or library. It's very important for all of us to use this
+power wisely, and to promote simplicity of usage wherever possible.
+Unfortunately, most of us fall all too easily into the complexity
+trap, as Alex Payne's tweet sums up very nicely.
+
+<blockquote class="twitter-tweet" lang="en"><p lang="en" dir="ltr">“Complexity is like a bug light for smart people. We can&#39;t resist it, even though we know it&#39;s bad for us.” <a href="https://t.co/V9Izi573CF">https://t.co/V9Izi573CF</a></p>&mdash; Alex Payne (@al3x) <a href="https://twitter.com/al3x/status/683036775942496256">January 1, 2016</a></blockquote>
+<script async src="//platform.twitter.com/widgets.js" charset="utf-8"></script>
+
+I have been as guilty of complication as everybody else. Is
+`CanBuildFrom` the most appropriate solution to deal with the
+constraints of embedding special types such as arrays and strings in a
+collection library? It achieves its purpose of providing a uniform
+user-level API on disparate datatypes. But I now think with more
+effort we might be able to come up with a solution that works as well and
+is simpler. Another example, where I have doubts if not regrets, is
+the `/:` and `:\` operators in scala.collection. They are cute
+synonyms for folds, and I am still fond of the analogy with falling
+dominoes they evoke. But in retrospect I think maybe they did give a
+bad example for others to go overboard with symbolic operators.
+
+So my main agenda for the coming year is to work on making Scala
+simpler: The language, its foundations, its libraries. I hope you
+will join me in that venture.
+
+With that thought, I wish you a happy new year 2016.
+
+
+
+
diff --git a/docs/blog/_posts/2016-02-03-essence-of-scala.md b/docs/blog/_posts/2016-02-03-essence-of-scala.md
new file mode 100644
index 000000000..0d457e0d8
--- /dev/null
+++ b/docs/blog/_posts/2016-02-03-essence-of-scala.md
@@ -0,0 +1,145 @@
+---
+layout: blog
+title: The Essence of Scala
+author: Martin Odersky
+authorImg: /images/martin.jpg
+---
+
+What do you get if you boil Scala on a slow flame and wait until all
+incidental features evaporate and only the most concentrated essence
+remains? After doing this for 8 years we believe we have the answer:
+it's DOT, the calculus of dependent object types, that underlies Scala.
+
+A [paper on DOT](http://infoscience.epfl.ch/record/215280) will be
+presented in April at [Wadlerfest](http://events.inf.ed.ac.uk/wf2016),
+an event celebrating Phil Wadler's 60th birthday. There's also a prior
+technical report ([From F to DOT](http://arxiv.org/abs/1510.05216))
+by Tiark Rompf and Nada Amin describing a slightly different version
+of the calculus. Each paper describes a proof of type soundness that
+has been machine-checked for correctness.
+
+## The DOT calculus
+
+A calculus is a kind of mini-language that is small enough to be
+studied formally. Translated to Scala notation, the language covered
+by DOT is described by the following abstract grammar:
+
+ Value v = (x: T) => t Function
+ new { x: T => ds } Object
+
+ Definition d = def a = t Method definition
+ type A = T Type
+
+ Term t = v Value
+ x Variable
+ t1(t2) Application
+ t.a Selection
+ { val x = t1; t2 } Local definition
+
+ Type T = Any Top type
+ Nothing Bottom type
+ x.A Selection
+ (x: T1) => T2 Function
+ { def a: T } Method declaration
+ { type T >: T1 <: T2 } Type declaration
+ T1 & T2 Intersection
+ { x => T } Recursion
+
+The grammar uses several kinds of names:
+
+ x for (immutable) variables
+ a for (parameterless) methods
+ A for types
+
+The full calculus adds to this syntax formal _typing rules_ that
+assign types `T` to terms `t` and formal _evaluation rules_ that
+describe how a program is evaluated. The following _type soundness_
+property was shown with a mechanized (i.e. machine-checked) proof:
+
+> If a term `t` has type `T`, and the evaluation of `t` terminates, then
+ the result of the evaluation will be a value `v` of type `T`.
+
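This fragment is close enough to Scala that a tiny program using the forms above can be written down directly as a sketch. DOT's object form `new { x: T => ds }` is modeled here with an anonymous class, and all names (`Cell`, `DotDemo`) are illustrative, not part of the calculus:

```scala
// A small program covering the DOT forms: object creation with a type
// member and a method, a function value, selection, application, and
// a local definition.
trait Cell { type A; def get: A }             // { type A } & { def a: T }

object DotDemo {
  val c: Cell { type A = Int } =
    new Cell { type A = Int; def get = 42 }   // object with a type member
  val inc: Int => Int = x => x + 1            // function value (x: T) => t
  val r: Int = { val y = c.get; inc(y) }      // { val x = t1; t2 }, t.a, t1(t2)
}
```

The type soundness property then promises that evaluating `DotDemo.r`, which has type `Int`, yields a value of type `Int` (here `43`).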
+## Difficulties
+
+Formulating the precise soundness theorem and proving it was unexpectedly hard,
+because it uncovered some technical challenges that had not been
+studied in depth before. In DOT - as well as in many programming languages -
+you can have conflicting definitions. For instance you might have an abstract
+type declaration in a base class with two conflicting aliases in subclasses:
+
+ trait Base { type A }
+ trait Sub1 extends Base { type A = String }
+ trait Sub2 extends Base { type A = Int }
+ trait Bad extends Sub1 with Sub2
+
+Now, if you combine `Sub1` and `Sub2` in trait `Bad` you get a conflict,
+since the type `A` is supposed to be equal to both `String` and `Int`. If you do
+not detect the conflict and take the equalities at face value, you
+get `String = A = Int`, hence by transitivity `String = Int`! Once you
+are that far, you can of course engineer all sorts of situations where
+a program will typecheck but cause a wrong execution at runtime. In
+other words, type soundness is violated.
+
+Now, the problem is that one cannot always detect these
+inconsistencies, at least not by a local analysis that does not need
+to look at the whole program. What's worse, once you have an
+inconsistent set of definitions you can use these definitions to
+"prove" their own consistency - much like a mathematical theory that
+assumes `true = false` can "prove" every proposition including its own
+correctness.
+
+The crucial reason why type soundness still holds is this: If one
+compares `T` with an alias, one always does so relative to some _path_
+`x` that refers to the object containing `T`. So it's really `x.T =
+Int`. Now, we can show that during evaluation every such path refers
+to some object that was created with a `new`, and that, furthermore,
+every such object has consistent type definitions. The tricky bit is
+to carefully distinguish between the full typing rules, which allow
+inconsistencies, and the typing rules arising from runtime values,
+which do not.
+
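The path-relative nature of these comparisons can be seen in a small runnable sketch (the names `Base`, `roundTrip`, and `PathDemo` are hypothetical):

```scala
trait Base { type A }

object PathDemo {
  // x.A is only ever compared to String relative to the concrete path `x`;
  // here x's type carries the consistent alias A = String, so this is fine.
  def roundTrip(x: Base { type A = String })(s: String): String = {
    val a: x.A = s
    a
  }
  // At run time, `x` is bound to an object created with `new`, whose type
  // definitions are known to be consistent.
  val r: String = roundTrip(new Base { type A = String })("hello")
}
```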
+## Why is This Important?
+
+There are at least four reasons why insights obtained in the DOT
+project are important.
+
+ 1. They give us a well-founded explanation of _nominal typing_.
+ Nominal typing means that a type is distinguished from others
+ simply by having a different name.
+ For instance, given two trait definitions
+
+ trait A extends AnyRef { def f: Int }
+ trait B extends AnyRef { def f: Int }
+
+ we consider `A` and `B` to be different types, even though both
+ traits have the same parents and both define the same members.
+ The opposite of
+ nominal typing is structural typing, which treats types
+ that have the same structure as being the same. Most programming
+ languages are at least in part nominal whereas most formal type systems,
+ including DOT, are structural. But the abstract types in DOT
+ provide a way to express nominal types such as classes and traits.
+ The Wadlerfest paper contains examples that show how
+ one can express classes for standard types such as `Boolean` and `List` in DOT.
+
+ 2. They give us a stable basis on which we can study richer languages
+ that resemble Scala more closely. For instance, we can encode
+ type parameters as type members of objects in DOT. This encoding
+ can give us a better understanding of the interactions of
+ subtyping and generics. It can explain why variance rules
+ are the way they are and what the precise typing rules for
+ wildcard parameters `[_ <: T]`, `[_ >: T]` should be.
+
+ 3. DOT also provides a blueprint for Scala compilation. The new Scala
+ compiler _dotty_ has internal data structures that closely resemble DOT.
+ In particular, type parameters are immediately mapped to type members,
+ in the way we propose to encode them also in the calculus.
+
+ 4. Finally, the proof principles explored in the DOT work give us guidelines
+ to assess and treat other possible soundness issues. We now know much
+ better what conditions must be fulfilled to ensure type soundness.
+ This lets us put other constructs of the Scala language to the test,
+ either to increase our confidence that they are indeed sound, or
+ to show that they are unsound. In my next blog I will
+ present some of the issues we have discovered through that exercise.
+
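The encoding of type parameters as type members mentioned in points 2 and 3 can be sketched in current Scala, which supports both forms (the names `Box` and `box` are illustrative, not dotty internals):

```scala
// A "parameterized" abstraction expressed with a type member instead of
// a type parameter, plus a refinement that recovers the parameter.
trait Box { type Elem; def value: Elem }

object BoxDemo {
  def box[T](x: T): Box { type Elem = T } =
    new Box { type Elem = T; def value = x }

  val b = box(42)
  val n: Int = b.value   // Elem is statically known to be Int
}
```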
diff --git a/docs/blog/_posts/2016-02-17-scaling-dot-soundness.md b/docs/blog/_posts/2016-02-17-scaling-dot-soundness.md
new file mode 100644
index 000000000..0719cc3aa
--- /dev/null
+++ b/docs/blog/_posts/2016-02-17-scaling-dot-soundness.md
@@ -0,0 +1,158 @@
+---
+layout: blog
+title: Scaling DOT to Scala - Soundness
+author: Martin Odersky
+authorImg: /images/martin.jpg
+---
+
+In my [last
+blog post](http://www.scala-lang.org/blog/2016/02/03/essence-of-scala.html)
+I introduced DOT, a minimal calculus that underlies much of Scala.
+DOT is much more than an academic exercise, because it gives us
+guidelines on how to design a sound type system for full Scala.
+
+## Recap: The Problem of Bad Bounds
+
+As was argued in the previous blog post, the danger a path-dependent type
+system like Scala's faces is inconsistent bounds or aliases. For
+instance, you might have a type alias
+
+ type T = String
+
+in scope in some part of the program, but in another part the same
+type member `T` is known as
+
+ type T = Int
+
+If you connect the two parts, you end up being able to assign a `String`
+to an `Int` and vice versa, which is unsound - it will crash at
+runtime with a `ClassCastException`. The problem is that there
+is no obvious, practical, compile time analysis for DOT or
+Scala that ensures that all types have good bounds. Types can contain
+abstract type members with bounds that can be refined elsewhere and
+several independent refinements might lead together to a bad bound
+problem. Barring a whole-program analysis, there is no specific
+point in the program where we can figure this out straightforwardly.
+
+In DOT, the problem is resolved by insisting that every path prefix `p`
+of a type `p.T` is at runtime a concrete value. That way, we only have
+to check for good bounds when objects are _created_ with `new`, and
+that check is easy: When objects are created, we know their class and
+we can insist that all nested types in that class are aliases or
+have consistent bounds. So far so good.
+
+## Loopholes Caused by Scaling Up
+
+But if we want to scale up the DOT result for full Scala, several
+loopholes open up. These all come down to the fact that the prefix of
+a type selection might _not_ be a value that's constructed with a
+`new` at run time. The loopholes can be classified into three
+categories:
+
+ 1. The prefix value might be lazy, and never instantiated to anything, as in:
+
+ lazy val p: S = p
+ ... p.T ...
+
+ Note that trying to access the lazy value `p` would result in an infinite loop. But using `p` in a type does not force its evaluation, so we might never evaluate `p`. Since `p` is not initialized with a `new`, bad bounds for `T` would go undetected.
+
+ 2. The prefix value might be initialized to `null`, as in
+
+ val p: S = null
+ ... p.T ...
+
+ The problem here is similar to the first one. `p` is not initialized
+ with a `new` so we know nothing about the bounds of `T`.
+
+ 3. The prefix might be a type `T` in a type projection `T # A`, where `T`
+ is not associated with a runtime value.
+
+We can in fact construct soundness issues in all of these cases. Look
+at the discussion for issues [#50](https://github.com/lampepfl/dotty/issues/50)
+and [#1050](https://github.com/lampepfl/dotty/issues/1050) in the
+[dotty](https://github.com/lampepfl/dotty) repository
+on GitHub. All issues work fundamentally in the same way: Construct a type `S`
+which has a type member `T` with bad bounds, say
+
+ Any <: T <: Nothing
+
+Then, use the left subtyping to turn an expression of type `Any` into
+an expression of type `T` and use the right subtyping to turn that
+expression into an expression of type `Nothing`:
+
+ def f(x: Any): p.T = x
+ def g(x: p.T): Nothing = x
+
+Taken together, `g(f(x))` will convert every expression into an
+expression of type `Nothing`. Since `Nothing` is a subtype of every
+other type, this means you can convert an arbitrary expression to have
+any type you choose. Such a feat is an impossible promise, of
+course. The promise is usually broken at run-time by failing with a
+`ClassCastException`.
+
+## Plugging the Loopholes
+
+To get back to soundness we need to plug the loopholes. Some of the
+necessary measures are taken in pull request [#1051](https://github.com/lampepfl/dotty/issues/1051).
+That pull request
+
+ - tightens the rules for overrides of lazy values: lazy values
+ cannot override or implement non-lazy values,
+ - tightens the rules which lazy values can appear in paths: they
+ must be final and must have concrete types with known consistent bounds,
+ - allows type projections `T # A` only if `T` is a concrete type
+ with known consistent bounds.
+
+It looks like this is sufficient to plug soundness problems (1) and
+(3). To plug (2), we need to make the type system track nullability in
+more detail than we do now. Nullability tracking is a nice feature
+in its own right, but now we have an added incentive for implementing
+it: it would help to ensure type soundness.
+
+There's one sub-case of nullability checking which is much harder to do
+than the others. An object reference `x.f` might be `null` at run time
+because the field `f` is not yet initialized. This can lead to a
+soundness problem, but in a more roundabout way than the other issues
+we have identified. In fact, Scala guarantees that in a program that
+runs to completion without aborting, every field will eventually be
+initialized, so every non-null field will have good bounds. Therefore,
+the only way an uninitialized field `f` could cause a soundness problem
+is if the program in question would never get to initialize `f`,
+either because it goes into an infinite loop or because it aborts with
+an exception or `System.exit` call before reaching the initialization
+point of `f`. It's a valid question whether type soundness guarantees
+should extend to this class of "strange" programs. We might want to
+draw the line here and resort to runtime checks or exclude "strange"
+programs from any soundness guarantees we can give. The research community
+has coined the term [soundiness](http://soundiness.org/) for
+this kind of approach and has [advocated](http://cacm.acm.org/magazines/2015/2/182650-in-defense-of-soundiness/fulltext) for it.
+
+The necessary restrictions on type projection `T # A` are problematic
+because they invalidate some idioms in type-level programming. For
+instance, the cute trick of making Scala's type system Turing complete
+by having it [simulate SK
+combinators](https://michid.wordpress.com/2010/01/29/scala-type-level-encoding-of-the-ski-calculus/)
+would no longer work since that one relies on unrestricted type
+projections. The same holds for some of the encodings of type-level
+arithmetic.
+
+To ease the transition, we will continue for a while to allow unrestricted type
+projections under a flag, even though they are potentially
+unsound. In the current dotty compiler, that flag is a language import
+`-language:Scala2`, but it could be something different for other
+compilers, e.g. `-unsafe`. Maybe we can find rules that are less
+restrictive than the ones we have now, and are still sound. But one
+aspect should be non-negotiable: Any fundamental deviations from the
+principles laid down by DOT need to be proven mechanically correct
+just like DOT was. We have achieved a lot with the DOT proofs, so we
+should make sure not to backslide. And if the experience of the past
+10 years has taught us one thing, it is that the meta theory of type
+systems has many more surprises in store than one might think. That's
+why mechanical proofs are essential.
+
+
+
+
+
+
+
diff --git a/docs/blog/_posts/2016-05-05-multiversal-equality.md b/docs/blog/_posts/2016-05-05-multiversal-equality.md
new file mode 100644
index 000000000..83bc67059
--- /dev/null
+++ b/docs/blog/_posts/2016-05-05-multiversal-equality.md
@@ -0,0 +1,89 @@
+---
+layout: blog
+title: Multiversal Equality for Scala
+author: Martin Odersky
+authorImg: /images/martin.jpg
+---
+
+I have recently been working on making equality tests using `==` and `!=` safer in Scala. This has led to a [Language Enhancement Proposal](https://github.com/lampepfl/dotty/issues/1247) which I summarize in this blog post.
+
+## Why Change Equality?
+
+Scala prides itself on its strong static type system. Its type discipline is particularly useful when it comes to refactoring. Indeed, it's possible to write programs in such a way that refactoring problems show up with very high probability as type errors. This is essential for being able to refactor with the confidence that nothing will break. And the ability to do such refactorings is in turn very important for keeping code bases from rotting.
+
+Of course, getting such a robust code base requires the cooperation of the developers. They should avoid type `Any`, casts, [stringly typed](http://c2.com/cgi/wiki?StringlyTyped) logic, and more generally any operation over loose types that do not capture the important properties of a value. Unfortunately, there is one area in Scala where such loose types are very hard to avoid: That's equality. Comparisons with `==` and `!=` are _universal_. They compare any two values, no matter what their types are. This causes real problems for writing code and more problems for refactoring it.
+
+For instance, one might want to introduce a proxy for some data structure so that instead of accessing the data structure directly one goes through the proxy. The proxy and the underlying data would have different types. Normally this should be an easy refactoring. If one accidentally passes a proxy in place of the underlying type, or _vice versa_, the type checker will flag the error. However, if one accidentally compares a proxy with the underlying type using `==` or a pattern match, the program is still valid, but will just always say `false`. This is a real worry in practice. I recently abandoned a desirable, extensive refactoring because I feared that it would be too hard to track down such errors.
+
+## Where Are We Today?
+
+The problems of universal equality in Scala are of course well known. Some libraries have tried to fix them by adding another equality operator with more restricted typing. Most often this safer equality is written `===`. While `===` is certainly useful, I am not a fan of adding another equality operator to the language and core libraries. It would be much better if we could fix `==` instead. This would be both simpler and would catch all potential equality problems, including those related to pattern matching.
+
+How can `==` be fixed? It looks much harder to do this than adding an alternate equality operator. First, we have to keep backwards compatibility. The ability to compare everything to everything is by now baked into lots of code and libraries. Second, with just one equality operator we need to make this operator work in all cases where it makes sense. An alternative `===` operator can choose to refuse some comparisons that should be valid because there's always `==` to fall back to. With a unique `==` operator we do not have this luxury.
+
+The current status in Scala is that the compiler will give warnings for _some_ comparisons that are always `false`. But the coverage is weak. For instance this will give a warning:
+
+```
+scala> 1 == "abc"
+<console>:12: warning: comparing values of types Int and String using `==' will always yield false
+```
+
+But this will not:
+
+```
+scala> "abc" == 1
+res2: Boolean = false
+```
+
+There are also cases where a warning is given for a valid equality test that actually makes sense because the result could be `true`. In summary, the current checking catches some obvious bugs, which is nice. But it is far too weak and fickle to be an effective refactoring aid.
+
+
+## What's Proposed?
+
+I believe that, to do better, we need to enlist the cooperation of developers. Ultimately it's the developer who provides implementations of equality methods and who is therefore best placed to characterize which equalities make sense. Sometimes this characterization can be involved. For instance, an `Int` can be compared to other primitive numeric values or to instances of type `java.lang.Number`, but any other comparison will always yield `false`. Or, it makes sense to compare two `Option` values if and only if it makes sense to compare the optional element values.
+
+The best-known way to characterize such relationships is with type classes. Implicit values of a trait `Eq[T, U]` can capture the property that values of type `T` can be compared to values of type `U`. Here's the definition of `Eq`:
+
+```
+package scala
+
+trait Eq[-T, -U]
+```
+
+That is, `Eq` is a pure marker trait with two type parameters and without any members. Developers can define equality classes by giving implicit `Eq` instances. Here is a simple one:
+
+```
+implicit def eqString: Eq[String, String] = Eq
+```
+
+This states that strings can be only compared to strings, not to values of other types. Here's a more complicated `Eq` instance:
+
+```
+implicit def eqOption[T, U](implicit _eq: Eq[T, U]): Eq[Option[T], Option[U]] = Eq
+```
+
+This states that `Option` values can be compared if their elements can be compared.
+
+It's foreseen that such `Eq` instances can be generated automatically. If we add an annotation `@equalityClass` to `Option` like this
+
+```
+@equalityClass class Option[+T] { ... }
+```
+
+then the `eqOption` definition above would be generated automatically in `Option`'s companion object.
+
+Given a set of `Eq` instances, the idea is that the Scala compiler will check every time it encounters a _potentially problematic_ comparison between values of types `T` and `U` that there is an implicit instance of `Eq[T, U]`. A comparison is _potentially problematic_ if it is between incompatible types. As long as `T <: U` or `U <: T` the equality could make sense because both sides can potentially be the same value.
+
+So this means we still keep universal equality as it is in Scala now - we don't have a choice here anyway, because of backwards compatibility. But we render it safe by checking that for each comparison the corresponding `Eq` instance exists.
+
+What about types for which no `Eq` instance exists? To maintain backwards compatibility, we allow comparisons of such types as well, by means of a fall-back `eqAny` instance. But we do not allow comparisons between types that have an `Eq` instance and types that have none. Details are explained in the [proposal](https://github.com/lampepfl/dotty/issues/1247).
+
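The compiler-level check described above can be approximated today as a library-level sketch. The operator name `===`, the syntax object, and the demo values below are assumptions of this sketch, not part of the proposal, which targets `==` itself:

```scala
trait Eq[-T, -U]
object Eq extends Eq[Any, Any] {
  // Instances live in Eq's companion, so they are found without imports.
  implicit def eqString: Eq[String, String] = Eq
  implicit def eqInt: Eq[Int, Int] = Eq
  implicit def eqOption[T, U](implicit ev: Eq[T, U]): Eq[Option[T], Option[U]] = Eq
}

object EqSyntax {
  implicit class EqOps[T](val lhs: T) {
    // Compiles only when an Eq[T, U] instance is in scope.
    def ===[U](rhs: U)(implicit ev: Eq[T, U]): Boolean = lhs == rhs
  }
}

object EqDemo {
  import EqSyntax._
  val sameStrings = "abc" === "abc"          // true
  val diffOptions = Option(1) === Option(2)  // false
  // "abc" === 1   // rejected: no Eq[String, Int] instance exists
}
```

Unlike the proposal, this sketch has no `eqAny` fall-back, so every comparison with `===` requires an explicit instance; the proposal keeps `==` universal and only flags comparisons between partitioned types.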
+## Properties
+
+Here are some nice properties of the proposal:
+
+1. It is _opt-in_. To get safe checking, developers have to annotate with `@equalityClass` classes that should allow comparisons only between their instances, or they have to define implicit `Eq` instances by hand.
+2. It is backwards compatible. Without developer-provided `Eq` instances, equality works as before.
+3. It carries no run-time cost compared to universal equality. Indeed the run-time behavior of equality is not affected at all.
+4. It has no problems with parametricity, variance, or bottom types.
+5. Depending on the actual `Eq` instances given, it can be very precise. That is, no comparisons that might yield `true` need to be rejected, and most comparisons that will always yield `false` are in fact rejected.
+
+The scheme effectively leads to a partition of the former universe of types into sets of types. Values with types in the same partition can be compared among themselves but values with types in different partitions cannot. An `@equalityClass` annotation on a type creates a new partition. All types that do not have any `Eq` instances (except `eqAny`, that is) form together another partition. So instead of a single _universe_ of values that can be compared to each other we get a _multiverse_ of partitions. Hence the name of the proposal: **Multiversal Equality**.
diff --git a/docs/blog/index.html b/docs/blog/index.html
new file mode 100644
index 000000000..7063378d5
--- /dev/null
+++ b/docs/blog/index.html
@@ -0,0 +1,22 @@
+---
+layout: default
+title: "Blog"
+---
+
+<h1>Blog</h1>
+
+<ul class="post-list">
+ {% for post in site.posts %}
+ <li>
+ <div>
+ <a href="{{ post.url }}">{{ post.title }}</a>
+ </div>
+ <div class="date">
+ {{ post.date | date: '%B %d, %Y' }}
+ </div>
+ <div class="excerpt">
+ {{ post.excerpt }}
+ </div>
+ </li>
+ {% endfor %}
+</ul>
diff --git a/docs/contributing/eclipse.md b/docs/contributing/eclipse.md
new file mode 100644
index 000000000..46301dc42
--- /dev/null
+++ b/docs/contributing/eclipse.md
@@ -0,0 +1,50 @@
+---
+layout: default
+title: Eclipse
+---
+
+Building Dotty with Eclipse
+===========================
+
+Build setup
+-----------
+You may need to redo these steps when the build changes.
+
+1. Run `sbt eclipse`
+2. In dotty, go to `Properties > Java Build Path > Libraries`.
+ Remove the Scala Compiler container (currently 2.11.4) and add as an
+ external jar the latest compiler version in the Ivy cache. This is
+ currently:
+
+ ```
+ .ivy2/cache/me.d-d/scala-compiler/jars/scala-compiler-2.11.5-20160322-171045-e19b30b3cd.jar
+ ```
+
+  But that might change in the future. Alternatively, copy the latest Scala
+  compiler from the cache to a stable name and use that as the external jar.
+
+3. It is recommended to change the default output folder (in `Properties >
+   Java Build Path > Source`) to `dotty/classes` instead of `dotty/bin` because
+   `dotty/bin` is reserved for shell scripts.
+
+If you have `CLASSPATH` defined:
+
+4. Update your classpath to contain any new required external libraries to run
+ `./bin/dotc`, `./bin/doti` outside of Eclipse.
+
+5. Open the `Run Configurations` tab, and edit the `tests` configuration so
+ that it contains a `CLASSPATH` variable which reflects the current
+ `CLASSPATH`.
+
+In order for compilation errors related to `ENUM` to be resolved, make sure
+that scala-reflect 2.11.5 is on the classpath.
+
+Running the compiler Main class from Eclipse
+--------------------------------------------
+1. Navigate to `dotty.tools.dotc.Main`
+2. `Run As... > Scala Application`
+3. `Run Configurations > Main$ > Classpath > Bootstrap entries`:
+ - Add the Scala library (`Advanced...` > `Add library...` > `Scala library`)
+ - Add the Dotty classfiles (`Add projects...` > `[x] dotty`)
+4. `Run Configurations > Main$ > Arguments` and add
+ `${project_loc}/examples/hello.scala`
diff --git a/docs/contributing/getting-started.md b/docs/contributing/getting-started.md
new file mode 100644
index 000000000..92afd02f3
--- /dev/null
+++ b/docs/contributing/getting-started.md
@@ -0,0 +1,42 @@
+---
+layout: default
+title: "Getting Started"
+---
+
+Getting Started
+===============
+
+Talks on Dotty
+--------------
+- [Scala's Road Ahead](https://www.youtube.com/watch?v=GHzWqJKFCk4) by Martin Odersky [\[slides\]](http://www.slideshare.net/Odersky/scala-days-nyc-2016)
+- [Compilers are Databases](https://www.youtube.com/watch?v=WxyyJyB_Ssc) by Martin Odersky [\[slides\]](http://www.slideshare.net/Odersky/compilers-are-databases)
+- [Dotty: Exploring the future of Scala](https://www.youtube.com/watch?v=aftdOFuVU1o) by Dmitry Petrashko [\[slides\]](https://d-d.me/scalaworld2015/#/). This talk includes details about the design of mini-phases and denotations.
+- [Making your Scala applications smaller and faster with the Dotty linker](https://www.youtube.com/watch?v=xCeI1ArdXM4) by Dmitry Petrashko [\[slides\]](https://d-d.me/scaladays2015/#/)
+- [Dotty: what is it and how it works](https://www.youtube.com/watch?v=wCFbYu7xEJA) by Guillaume Martres [\[slides\]](http://guillaume.martres.me/talks/dotty-tutorial/#/)
+- [Hacking on Dotty: A live demo](https://www.youtube.com/watch?v=0OOYGeZLHs4) by Guillaume Martres [\[slides\]](http://guillaume.martres.me/talks/dotty-live-demo/)
+- [AutoSpecialization in Dotty](https://vimeo.com/165928176) by Dmitry Petrashko [\[slides\]](https://d-d.me/talks/flatmap2016/#/)
+- [Dotty and types: the story so far](https://www.youtube.com/watch?v=YIQjfCKDR5A) by Guillaume Martres [\[slides\]](http://guillaume.martres.me/talks/typelevel-summit-oslo/)
+
+Requirements
+------------
+Make sure that you are using Java 8 or later; the output of `java -version`
+should contain `1.8`.
+
+Compiling and running code
+--------------------------
+```bash
+git clone https://github.com/lampepfl/dotty.git
+cd dotty
+# Clone dotty-compatible stdlib. Needed for running the test suite.
+git clone -b dotty-library https://github.com/DarkDimius/scala.git scala-scala
+# Compile code using Dotty
+./bin/dotc tests/pos/HelloWorld.scala
+# Run it with the proper classpath
+./bin/dotr HelloWorld
+```
+
+Starting a REPL
+---------------
+```bash
+./bin/dotr
+```
diff --git a/docs/contributing/intellij-idea.md b/docs/contributing/intellij-idea.md
new file mode 100644
index 000000000..dda04f515
--- /dev/null
+++ b/docs/contributing/intellij-idea.md
@@ -0,0 +1,36 @@
+---
+layout: default
+---
+
+Building Dotty with Intellij IDEA
+=================================
+Dotty compiler support is available in the [Scala plugin nightly] starting
+from 2.2.39. You need to install [IDEA 2016.1] to try it.
+
+## To create a new project with Dotty
+
+1. Open the New Project dialog and select `Scala` > `Dotty`
+2. Proceed as usual and don't forget to create or select a Dotty SDK.
+
+## To compile an existing Scala project with Dotty
+
+1. Create a new Dotty SDK:
+ `Project Structure` > `Global libraries` > `New Global Library` > `Dotty SDK`
+2. Replace Scala SDK with Dotty SDK in:
+ `Project Structure` > `Modules` > `Dependencies`
+
+Java 1.8 should be used as the Project/Module SDK. You also need to enable the
+Scala Compile Server to use the Dotty compiler.
+
+## Notes
+* Dotty support is experimental; many features, including code highlighting
+  and the worksheet, are not ready yet.
+* You can download the latest version of Dotty without creating a new Dotty SDK
+ with the `Update snapshot` button in the Dotty SDK library settings.
+* Please report any problems to the [IntelliJ Scala issue tracker] or write
+  to the [IntelliJ Scala gitter].
+
+[Scala plugin nightly]: https://confluence.jetbrains.com/display/SCA/Scala+Plugin+Nightly
+[IDEA 2016.1]: https://www.jetbrains.com/idea/nextversion/
+[IntelliJ Scala issue tracker]: https://youtrack.jetbrains.com/issues/SCL
+[IntelliJ Scala gitter]: https://gitter.im/JetBrains/intellij-scala
diff --git a/docs/contributing/workflow.md b/docs/contributing/workflow.md
new file mode 100644
index 000000000..e160999d9
--- /dev/null
+++ b/docs/contributing/workflow.md
@@ -0,0 +1,81 @@
+---
+layout: default
+title: "Workflow"
+---
+
+Workflow
+========
+This document details common workflow patterns when working with Dotty.
+
+## Compiling files with dotc ##
+
+From sbt:
+
+```bash
+> run <OPTIONS> <FILE>
+```
+
+From terminal:
+
+```bash
+$ ./bin/dotc <OPTIONS> <FILE>
+```
+
+Here are some useful debugging `<OPTIONS>`:
+
+* `-Xprint:PHASE1,PHASE2,...` or `-Xprint:all`: prints the `AST` after each
+ specified phase. Phase names can be found by searching
+ `src/dotty/tools/dotc/transform/` for `phaseName`.
+* `-Ylog:PHASE1,PHASE2,...` or `-Ylog:all`: enables `ctx.log("")` logging for
+ the specified phase.
+* `-Ycheck:all` verifies the consistency of `AST` nodes between phases, in
+ particular checks that types do not change. Some phases currently can't be
+ `Ycheck`ed, therefore in the tests we run:
+ `-Ycheck:tailrec,resolveSuper,mixin,restoreScopes,labelDef`.
+
+Additional logging information can be obtained by changing some `noPrinter` to
+`new Printer` in `src/dotty/tools/dotc/config/Printers.scala`. This enables the
+`subtyping.println("")` and `ctx.traceIndented("", subtyping)` style logging.
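+
+The `noPrinter` mechanism can be modelled in a few lines (an illustrative
+sketch only; the real declarations live in
+`src/dotty/tools/dotc/config/Printers.scala`):
+
+```scala
+object PrintersSketch {
+  class Printer {
+    def println(msg: => String): Unit = Console.println(msg)
+  }
+  // noPrinter swallows its by-name argument, so disabled logging
+  // channels never even build their message strings
+  object noPrinter extends Printer {
+    override def println(msg: => String): Unit = ()
+  }
+  // flip `noPrinter` to `new Printer` to enable a channel:
+  val subtyping: Printer = noPrinter
+}
+```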
+
+## Running tests ##
+
+```bash
+$ sbt
+> partest --show-diff --verbose
+```
+
+## Running single tests ##
+To run a specific test `tests/x/y.scala` (for example `tests/pos/t210.scala`):
+
+```bash
+> partest-only-no-bootstrap --show-diff --verbose tests/partest-generated/x/y.scala
+```
+
+Currently this will re-run some tests and do some preprocessing because of the
+way partest has been set up.
+
+## Inspecting Trees with Type Stealer ##
+
+There is no power mode for the REPL yet, but you can inspect types with the
+type stealer:
+
+```bash
+$ ./bin/dotr
+scala> import test.DottyTypeStealer._; import dotty.tools.dotc.core._; import Contexts._,Types._
+```
+
+Now, you can define types and access their representation. For example:
+
+```scala
+scala> val s = stealType("class O { type X }", "O#X")
+scala> implicit val ctx: Context = s._1
+scala> val t = s._2(0)
+t: dotty.tools.dotc.core.Types.Type = TypeRef(TypeRef(ThisType(TypeRef(NoPrefix,<empty>)),O),X)
+scala> val u = t.asInstanceOf[TypeRef].underlying
+u: dotty.tools.dotc.core.Types.Type = TypeBounds(TypeRef(ThisType(TypeRef(NoPrefix,scala)),Nothing), TypeRef(ThisType(TypeRef(NoPrefix,scala)),Any))
+```
+
+## Pretty-printing ##
+Many objects in the dotc compiler implement a `Showable` trait (e.g. `Tree`,
+`Symbol`, `Type`). These objects may be pretty-printed using the `.show`
+method.
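+
+The underlying pattern is easy to sketch in plain Scala (illustrative
+only; dotc's real `show` also takes an implicit `Context`):
+
+```scala
+object ShowableSketch {
+  trait Showable { def show: String }
+
+  case class Ident(name: String) extends Showable {
+    def show: String = name
+  }
+  case class Apply(fn: Showable, arg: Showable) extends Showable {
+    // pretty-print by recursively showing the sub-trees
+    def show: String = s"${fn.show}(${arg.show})"
+  }
+}
+```
+
+With this in hand, `Apply(Ident("f"), Ident("x")).show` yields `"f(x)"`.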
diff --git a/docs/css/main.scss b/docs/css/main.scss
new file mode 100644
index 000000000..bbc780dd1
--- /dev/null
+++ b/docs/css/main.scss
@@ -0,0 +1,238 @@
+---
+# Only the main Sass file needs front matter (the dashes are enough)
+---
+@charset "utf-8";
+
+@import 'https://fonts.googleapis.com/css?family=Source+Code+Pro';
+
+// Our variables
+$base-font-family: "Helvetica Neue", Helvetica, Arial, sans-serif;
+$base-font-size: 16px;
+$code-font-family: 'Source Code Pro', monospace;
+$base-font-weight: 400;
+$small-font-size: $base-font-size * 0.875;
+$base-line-height: 1.5;
+
+$spacing-unit: 30px;
+
+$text-color: #111;
+$background-color: #fdfdfd;
+$brand-color: #2a7ae2;
+
+
+$blue: #3498db;
+$blue-light: rgba(52, 152, 219, 0.12);
+$grey: #f8f8f8;
+$red: #de332e;
+
+// content area
+$distance-top: 80px;
+$content-width: 1150px;
+$on-palm: 600px;
+$on-laptop: 800px;
+$toc-width: 220px;
+// Minima also includes a mixin for defining media queries.
+// Use media queries like this:
+// @include media-query($on-palm) {
+// .wrapper {
+// padding-right: $spacing-unit / 2;
+// padding-left: $spacing-unit / 2;
+// }
+// }
+@import "minima";
+
+html {
+ box-sizing: border-box;
+}
+
+*, *:before, *:after {
+ box-sizing: inherit;
+}
+
+div#container {
+ position: relative;
+ max-width: $content-width;
+ margin: 0 auto;
+ overflow: hidden;
+
+ > div#scala-logo-mobile {
+ display: none;
+ }
+
+ div#content {
+ margin-top: $distance-top;
+ width: $content-width - $toc-width;
+ float: right;
+ display: inline;
+
+ > h1.title {
+ margin-bottom: 0;
+ }
+
+ > h2.subtitle {
+ font-size: 20px;
+ }
+ }
+
+ div#toc {
+ float: left;
+ margin-top: $distance-top;
+ padding-right: 20px;
+ width: $toc-width;
+
+ > div {
+ position: fixed;
+ top: $distance-top;
+ bottom: auto;
+
+ div#scala-logo {
+ width: 64px;
+ margin-bottom: 25px;
+
+ > svg path {
+ fill: $red;
+ }
+ }
+
+ ul#categories {
+ list-style-type: none;
+ margin: 0;
+ padding: 0;
+ background-color: transparent;
+
+ > li {
+ border-right: 2px solid transparent;
+ margin-bottom: 15px;
+
+ ul {
+ list-style-type: none;
+ margin: 0;
+ padding: 0;
+ background: transparent;
+
+ li:hover {
+ border-right: 3px solid $blue;
+ background-color: $blue-light;
+
+ a:link, a:visited, a:hover, a:focus {
+ text-decoration: none;
+ }
+ }
+ }
+ }
+ }
+ }
+ }
+}
+
+div.author-container {
+ height: 50px;
+ margin-bottom: 15px;
+ > img {
+ float: left;
+ width: 100px;
+ border-radius: 50%;
+ }
+
+ > div.author-info {
+ color: rgba(0,0,0,0.45);
+ float: left;
+ }
+}
+
+div.author-container.spaced {
+ height: 100px;
+
+ > div.author-info {
+ margin: 28px 0 0 20px;
+ }
+}
+
+ul.post-list {
+ > li+li {
+ margin-top: 15px;
+ }
+
+ > li {
+ div.date {
+ color: rgba(0,0,0,0.45);
+ }
+ }
+}
+
+pre, code {
+ padding: 0;
+ border: 0;
+ border-radius: 3px;
+ background-color: $grey;
+ font-family: $code-font-family;
+}
+
+body {
+ font: 400 16px/1.5 "Helvetica Neue", Helvetica, Arial, sans-serif;
+ color: #111;
+}
+
+@include media-query(1166px) {
+ div#container {
+ max-width: 1000px;
+
+ div#content {
+ width: 1000px - $toc-width;
+ }
+ }
+}
+
+@include media-query(1016px) {
+ div#container {
+ max-width: 900px;
+
+ div#content {
+ width: 900px - $toc-width;
+ }
+ }
+}
+
+@include media-query(915px) {
+ body {
+ min-width: 0;
+ }
+
+ div#container {
+ max-width: none;
+ padding: 12px;
+
+ > div#scala-logo-mobile {
+ display: block;
+ width: 64px;
+ margin: 15px auto 0;
+
+ > svg path {
+ fill: $red;
+ }
+ }
+
+ div#toc {
+ float: none;
+ width: 100%;
+ height: auto;
+ margin-top: 0;
+
+ > div {
+ top: 0;
+ position: relative;
+
+ svg {
+ display: none;
+ top: auto;
+ }
+ }
+ }
+ div#content {
+ float: none;
+ width: 100%;
+ height: auto;
+ max-width: none;
+ }
+ }
+}
diff --git a/docs/dotc-internals/overall-structure.md b/docs/dotc-internals/overall-structure.md
deleted file mode 100644
index a80c35b4c..000000000
--- a/docs/dotc-internals/overall-structure.md
+++ /dev/null
@@ -1,174 +0,0 @@
-# Dotc's Overall Structure
-
-The compiler code is found in package [dotty.tools](https://github.com/lampepfl/dotty/tree/master/src/dotty/tools). It spans the
-following three sub-packages:
-
- backend Compiler backends (currently for JVM and JS)
- dotc The main compiler
- io Helper modules for file access and classpath handling.
-
-The [dotc](https://github.com/lampepfl/dotty/tree/master/src/dotty/tools/dotc)
-package contains some main classes that can be run as separate
-programs. The most important one is class
-[Main](https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/Main.scala).
-`Main` inherits from
-[Driver](https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/Driver.scala) which
-contains the highest level functions for starting a compiler and processing some sources.
-`Driver` in turn is based on two other high-level classes,
-[Compiler](https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/Compiler.scala) and
-[Run](https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/Run.scala).
-
-## Package Structure
-
-Most functionality of `dotc` is implemented in subpackages of `dotc`. Here's a list of sub-packages
-and their focus.
-
- ast Abstract syntax trees,
- config Compiler configuration, settings, platform specific definitions.
- core Core data structures and operations, with specific subpackages for:
-
- core.classfile Reading of Java classfiles into core data structures
- core.tasty Reading and writing of TASTY files to/from core data structures
- core.unpickleScala2 Reading of Scala2 symbol information into core data structures
-
- parsing Scanner and parser
- printing Pretty-printing trees, types and other data
- repl The interactive REPL
- reporting Reporting of error messages, warnings and other info.
- rewrite Helpers for rewriting Scala 2's constructs into dotty's.
- transform Miniphases and helpers for tree transformations.
- typer Type-checking and other frontend phases
- util General purpose utility classes and modules.
-
-## Contexts
-
-`dotc` has almost no global state (the only significant bit of global state is the name table,
-which is used to hash strings into unique names). Instead, all essential bits of information that
-can vary over a compiler run are collected in a
-[Context](https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/core/Contexts.scala).
-Most methods in `dotc` take a Context value as an implicit parameter.
-
-Contexts give a convenient way to customize values in some part of the
-call-graph. To run, e.g. some compiler function `f` at a given
-phase `phase`, we invoke `f` with an explicit context parameter, like
-this
-
- f(/*normal args*/)(ctx.withPhase(phase))
-
-This assumes that `f` is defined in the way most compiler functions are:
-
- def f(/*normal parameters*/)(implicit ctx: Context) ...
-
-Compiler code follows the convention that all implicit `Context`
-parameters are named `ctx`. This is important to avoid implicit
-ambiguities in the case where nested methods each contain a `Context`
-parameter. The common name then ensures that the implicit parameters
-properly shadow each other.
-
-Sometimes we want to make sure that implicit contexts are not captured
-in closures or other long-lived objects, be it because we want to
-enforce that nested methods each get their own implicit context, or
-because we want to avoid a space leak in the case where a closure can
-survive several compiler runs. A typical case is a completer for a
-symbol representing an external class, which produces the attributes
-of the symbol on demand, and which might never be invoked. In that
-case we follow the convention that any context parameter is explicit,
-not implicit, so we can track where it is used, and that it has a name
-different from `ctx`. Commonly used is `ictx` for "initialization
-context".
-
-With these two conventions in place, it has turned out that implicit
-contexts work amazingly well as a device for dependency injection and
-bulk parameterization. There is of course always the danger that
-an unexpected implicit will be passed, but in practice this has not turned out to
-be much of a problem.
-
-## Compiler Phases
-
-Seen from a temporal perspective, the `dotc` compiler consists of a list of phases.
-The current list of phases is specified in class [Compiler](https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/Compiler.scala) as follows:
-
-```scala
- def phases: List[List[Phase]] = List(
- List(new FrontEnd), // Compiler frontend: scanner, parser, namer, typer
- List(new PostTyper), // Additional checks and cleanups after type checking
- List(new Pickler), // Generate TASTY info
- List(new FirstTransform, // Some transformations to put trees into a canonical form
- new CheckReentrant), // Internal use only: Check that compiled program has no data races involving global vars
- List(new RefChecks, // Various checks mostly related to abstract members and overriding
- new CheckStatic, // Check restrictions that apply to @static members
- new ElimRepeated, // Rewrite vararg parameters and arguments
- new NormalizeFlags, // Rewrite some definition flags
- new ExtensionMethods, // Expand methods of value classes with extension methods
- new ExpandSAMs, // Expand single abstract method closures to anonymous classes
- new TailRec, // Rewrite tail recursion to loops
- new LiftTry, // Put try expressions that might execute on non-empty stacks into their own methods
- new ClassOf), // Expand `Predef.classOf` calls.
- List(new PatternMatcher, // Compile pattern matches
- new ExplicitOuter, // Add accessors to outer classes from nested ones.
- new ExplicitSelf, // Make references to non-trivial self types explicit as casts
- new CrossCastAnd, // Normalize selections involving intersection types.
- new Splitter), // Expand selections involving union types into conditionals
- List(new VCInlineMethods, // Inlines calls to value class methods
- new SeqLiterals, // Express vararg arguments as arrays
- new InterceptedMethods, // Special handling of `==`, `|=`, `getClass` methods
- new Getters, // Replace non-private vals and vars with getter defs (fields are added later)
- new ElimByName, // Expand by-name parameters and arguments
- new AugmentScala2Traits, // Expand traits defined in Scala 2.11 to simulate old-style rewritings
- new ResolveSuper), // Implement super accessors and add forwarders to trait methods
- List(new Erasure), // Rewrite types to JVM model, erasing all type parameters, abstract types and refinements.
- List(new ElimErasedValueType, // Expand erased value types to their underlying implementation types
- new VCElideAllocations, // Peep-hole optimization to eliminate unnecessary value class allocations
- new Mixin, // Expand trait fields and trait initializers
- new LazyVals, // Expand lazy vals
- new Memoize, // Add private fields to getters and setters
- new LinkScala2ImplClasses, // Forward calls to the implementation classes of traits defined by Scala 2.11
- new NonLocalReturns, // Expand non-local returns
- new CapturedVars, // Represent vars captured by closures as heap objects
- new Constructors, // Collect initialization code in primary constructors
- // Note: constructors changes decls in transformTemplate, no InfoTransformers should be added after it
- new FunctionalInterfaces,// Rewrites closures to implement @specialized types of Functions.
- new GetClass), // Rewrites getClass calls on primitive types.
- List(new LambdaLift, // Lifts out nested functions to class scope, storing free variables in environments
- // Note: in this mini-phase block scopes are incorrect. No phases that rely on scopes should be here
- new ElimStaticThis, // Replace `this` references to static objects by global identifiers
- new Flatten, // Lift all inner classes to package scope
- new RestoreScopes), // Repair scopes rendered invalid by moving definitions in prior phases of the group
- List(new ExpandPrivate, // Widen private definitions accessed from nested classes
- new CollectEntryPoints, // Find classes with main methods
- new LabelDefs), // Converts calls to labels to jumps
- List(new GenSJSIR), // Generate .js code
- List(new GenBCode) // Generate JVM bytecode
- )
-```
-
-Note that phases are grouped, so the `phases` method is of type
-`List[List[Phase]]`. The idea is that all phases in a group are
-*fused* into a single tree traversal. That way, phases can be kept
-small (most phases perform a single function) without requiring an
-excessive number of tree traversals (which are costly, because they
-have generally bad cache locality).
-
-Phases fall into four categories:
-
- - Frontend phases: `FrontEnd`, `PostTyper` and `Pickler`. `FrontEnd` parses the source programs and generates
- untyped abstract syntax trees, which are then typechecked and transformed into typed abstract syntax trees.
- `PostTyper` performs checks and cleanups that require a fully typed program. In particular, it
-
- - creates super accessors representing `super` calls in traits
- - creates implementations of synthetic (compiler-implemented) methods
- - avoids storing parameters passed unchanged from subclass to superclass in duplicate fields.
-
- Finally `Pickler` serializes the typed syntax trees produced by the frontend as TASTY data structures.
-
- - High-level transformations: All phases from `FirstTransform` to `Erasure`. Most of these phases transform
- syntax trees, expanding high-level constructs to more primitive ones. The last phase in the group, `Erasure`
- translates all types into types supported directly by the JVM. To do this, it performs another type checking
- pass, but using the rules of the JVM's type system instead of Scala's.
-
- - Low-level transformations: All phases from `ElimErasedValueType` to `LabelDefs`. These
- further transform trees until they are essentially a structured version of Java bytecode.
-
- - Code generators: These map the transformed trees to Java classfiles or Javascript files.
-
-
diff --git a/docs/dotc-internals/periods.md b/docs/dotc-internals/periods.md
deleted file mode 100644
index a616ba8a8..000000000
--- a/docs/dotc-internals/periods.md
+++ /dev/null
@@ -1,94 +0,0 @@
-# Dotc's concept of time
-
-Conceptually, the `dotc` compiler's job is to maintain views of
-various artifacts associated with source code at all points in time.
-But what is *time* for `dotc`? In fact, it is a combination of
-compiler runs and compiler phases.
-
-The *hours* of the compiler's clocks are measured in compiler
-[runs](https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/Run.scala). Every
-run creates a new hour, which follows all the compiler runs (hours) that
-happened before. `dotc` is designed to be used as an incremental
-compiler that can support incremental builds, as well as interactions
-in an IDE and a REPL. This means that new runs can occur quite
-frequently. At the extreme, every keystroke in an editor or REPL can
-potentially launch a new compiler run, so potentially an "hour" of
-compiler time might take only a fraction of a second in real time.
-
-The *minutes* of the compiler's clocks are measured in phases. At every
-compiler run, the compiler cycles through a number of
-[phases](https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/core/Phases.scala).
-The list of phases is defined in the [Compiler](https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/Compiler.scala) object.
-There are currently about 60 phases per run, so the minutes/hours
-analogy works out roughly. After every phase the view the compiler has
-of the world changes: trees are transformed, types are gradually simplified
-from Scala types to JVM types, definitions are rearranged, and so on.
-
-Many pieces of information in the compiler are time-dependent. For
-instance, a Scala symbol representing a definition has a type, but
-that type will usually change as one goes from the higher-level Scala
-view of things to the lower-level JVM view. There are different ways
-to deal with this. Many compilers change the type of a symbol
-destructively according to the "current phase". Another, more
-functional approach might be to have different symbols representing
-the same definition at different phases, with each symbol carrying a
-different immutable type. `dotc` employs yet another scheme, which is
-inspired by functional reactive programming (FRP): Symbols carry not a
-single type, but a function from compiler phase to type. So the type
-of a symbol is a time-indexed function, where time ranges over
-compiler phases.
-
-Typically, the definition of a symbol or other quantity remains stable
-for a number of phases. This leads us to the concept of a
-[period](https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/core/Periods.scala).
-Conceptually, a period is an interval of phases in a given
-compiler run. Periods are represented by three pieces of
-information:
-
- - the ID of the current run,
- - the ID of the phase starting the period
- - the number of phases in the period
-
-All three pieces of information are encoded in a value class over a 32 bit integer.
-Here's the API for class `Period`:
-
-```scala
- class Period(val code: Int) extends AnyVal {
- def runId: RunId // The run identifier of this period.
- def firstPhaseId: PhaseId // The first phase of this period
- def lastPhaseId: PhaseId // The last phase of this period
- def phaseId: PhaseId // The phase identifier of this single-phase period.
-
- def containsPhaseId(id: PhaseId): Boolean
- def contains(that: Period): Boolean
- def overlaps(that: Period): Boolean
-
- def & (that: Period): Period
- def | (that: Period): Period
- }
-```
-
-We can access the parts of a period using `runId`, `firstPhaseId`,
-`lastPhaseId`, or using `phaseId` for periods consisting only of a
-single phase. They return `RunId` or `PhaseId` values, which are
-aliases of `Int`. `containsPhaseId`, `contains` and `overlaps` test
-whether a period contains a phase or a period as a sub-interval, or
-whether the interval overlaps with another period. Finally, `&` and
-`|` produce the intersection and the union of two period intervals
-(the union operation `|` takes as `runId` the `runId` of its left
-operand, as periods spanning different `runId`s cannot be constructed).
-
-Periods are constructed using two `apply` methods:
-
-```scala
- object Period {
-
- /** The single-phase period consisting of given run id and phase id */
-    def apply(rid: RunId, pid: PhaseId): Period
-
- /** The period consisting of given run id, and lo/hi phase ids */
- def apply(rid: RunId, loPid: PhaseId, hiPid: PhaseId): Period
- }
-```
-
-As a sentinel value there's `Nowhere`, a period that is empty.
diff --git a/docs/images/favicon.png b/docs/images/favicon.png
new file mode 100644
index 000000000..ecd0e98fb
--- /dev/null
+++ b/docs/images/favicon.png
Binary files differ
diff --git a/docs/images/felix.jpeg b/docs/images/felix.jpeg
new file mode 100644
index 000000000..4db2642fb
--- /dev/null
+++ b/docs/images/felix.jpeg
Binary files differ
diff --git a/docs/images/fengyun.png b/docs/images/fengyun.png
new file mode 100644
index 000000000..cb7e5d19c
--- /dev/null
+++ b/docs/images/fengyun.png
Binary files differ
diff --git a/docs/images/martin.jpg b/docs/images/martin.jpg
new file mode 100644
index 000000000..c77654796
--- /dev/null
+++ b/docs/images/martin.jpg
Binary files differ
diff --git a/docs/images/nico.png b/docs/images/nico.png
new file mode 100644
index 000000000..32ebbed4a
--- /dev/null
+++ b/docs/images/nico.png
Binary files differ
diff --git a/docs/images/petrashko.png b/docs/images/petrashko.png
new file mode 100644
index 000000000..9b1a0fadb
--- /dev/null
+++ b/docs/images/petrashko.png
Binary files differ
diff --git a/docs/images/smarter.jpg b/docs/images/smarter.jpg
new file mode 100644
index 000000000..03e7ab8d3
--- /dev/null
+++ b/docs/images/smarter.jpg
Binary files differ
diff --git a/docs/index.md b/docs/index.md
new file mode 100644
index 000000000..b8c5e9c20
--- /dev/null
+++ b/docs/index.md
@@ -0,0 +1,34 @@
+---
+layout: default
+title: "Docs"
+---
+
+Dotty Documentation
+===================
+The Dotty compiler is currently somewhat lacking in documentation - PRs
+welcome! In the meantime, we've attempted to gather the most essential
+knowledge in these pages.
+
+Index
+-----
+* [Blog](blog/)
+* Usage
+ - [Migrating from Scala 2](usage/migrating.md)
+ - [Using Dotty with sbt](usage/sbt-projects.md)
+* Contributing
+ - [Getting Started](contributing/getting-started.md) details on how to run
+ tests, use the cli scripts
+ - [Workflow](contributing/workflow.md) common dev patterns and hints
+ - [Eclipse](contributing/eclipse.md) setting up dev environment
+ - [Intellij-IDEA](contributing/intellij-idea.md) setting up dev environment
+* Internals, documenting the compiler internals
+  - [Project Structure](internals/overall-structure.md) the overall
+    structure of the project
+  - [Backend](internals/backend.md) details on the bytecode backend
+  - [Contexts](internals/contexts.md) details the use of `Context` in the
+    compiler
+  - [Dotty vs Scala2](internals/dotc-scalac.md)
+  - [Higher Kinded Type Scheme](internals/higher-kinded-v2.md) the new
+    higher-kinded types scheme
+  - [Periods](internals/periods.md)
+  - [Type System](internals/type-system.md)
diff --git a/docs/internals/backend.md b/docs/internals/backend.md
new file mode 100644
index 000000000..1fb9bba26
--- /dev/null
+++ b/docs/internals/backend.md
@@ -0,0 +1,127 @@
+---
+layout: default
+title: "Backend Internals"
+---
+
+Backend Internals
+=================
+The code for the backend is split up by functionality and assembled in the
+object `GenBCode`.
+
+```none
+object GenBCode --- [defines] --> PlainClassBuilder GenBCode also defines class BCodePhase, the compiler phase
+ | |
+ [extends] [extends]
+ | |
+BCodeSyncAndTry ----------------> SyncAndTryBuilder
+ | |
+BCodeBodyBuilder ----------------> PlainBodyBuilder
+ | |
+BCodeSkelBuilder ----------------> PlainSkelBuilder
+ | / | \
+ BCodeHelpers ----------------> BCClassGen BCAnnotGen ... (more components)
+ | \ \
+ | \ \-------------> helper methods
+ | \ \------------> JMirrorBuilder, JBeanInfoBuilder (uses some components, e.g. BCInnerClassGen)
+ | \
+ | BytecodeWriters ---------> methods and classes to write byte code files
+ |
+ BCodeTypes ----------------> maps and fields for common BTypes, class Tracked, methods to collect information on classes, tests for BTypes (conforms), ...
+ |
+BCodeIdiomatic ----------------> utilities for code generation, e.g. genPrimitiveArithmetic
+ |
+ BCodeGlue ----------------> BType class, predefined BTypes
+```
+
+### Data Flow ###
+The compiler creates a `BCodePhase` and calls `runOn(compilationUnits)`, which:
+ * initializes fields of `GenBCode` defined in `BCodeTypes` (BType maps,
+   common BTypes like `StringReference`)
+ * initializes the `primitives` map defined in `scalaPrimitives` (maps
+   primitive members, like `int.+`, to bytecode instructions)
+ * creates `BytecodeWriter`, `JMirrorBuilder` and `JBeanInfoBuilder` instances
+ (on each compiler run)
+ * `buildAndSendToDisk(units)`: uses work queues, see below.
+ - `BCodePhase.addToQ1` adds class trees to `q1`
+ - `Worker1.visit` creates ASM `ClassNodes`, adds to `q2`. It creates one
+ `PlainClassBuilder` for each compilation unit.
+ - `Worker2.addToQ3` adds byte arrays (one for each class) to `q3`
+ - `BCodePhase.drainQ3` writes byte arrays to disk
+
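+The queue flow described above can be sketched in plain Scala (a schematic
+model only; in dotc the items are compilation-unit trees, ASM `ClassNodes` and
+byte arrays, and the workers live in `GenBCode.scala`):
+
+```scala
+import scala.collection.mutable
+
+// Schematic model of the three work queues (not the actual dotc code).
+// Tree stands in for compilation-unit ASTs, ClassNode for ASM class nodes.
+class Pipeline[Tree, ClassNode](
+    visit: Tree => ClassNode,           // what Worker1 does
+    serialize: ClassNode => Array[Byte] // what Worker2 does
+) {
+  val q1 = mutable.Queue.empty[Tree]
+  val q2 = mutable.Queue.empty[ClassNode]
+  val q3 = mutable.Queue.empty[Array[Byte]]
+
+  def addToQ1(tree: Tree): Unit = q1 += tree          // BCodePhase.addToQ1
+  def run(): Unit = {
+    while (q1.nonEmpty) q2 += visit(q1.dequeue())     // Worker1.visit
+    while (q2.nonEmpty) q3 += serialize(q2.dequeue()) // Worker2
+  }
+  def drainQ3(write: Array[Byte] => Unit): Unit =
+    while (q3.nonEmpty) write(q3.dequeue())           // BCodePhase.drainQ3
+}
+```
+
+Because the stages communicate only through the queues, they can later be
+overlapped by running the workers concurrently, as noted under (a) below.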
+
+### Architecture ###
+The architecture of `GenBCode` is the same as in Scalac. It can be partitioned
+into weakly coupled components (called "subsystems" below):
+
+
+#### (a) The queue subsystem ####
+Queues mediate between processors; a queue doesn't know what each processor does.
+
+The first queue contains AST trees for compilation units, the second queue
+contains ASM ClassNodes, and finally the third queue contains byte arrays,
+ready for serialization to disk.
+
+Currently the queue subsystem is all sequential, but as can be seen in
+http://magarciaepfl.github.io/scala/ the above design enables overlapping (a.1)
+building of ClassNodes, (a.2) intra-method optimizations, and (a.3)
+serialization to disk.
+
+This subsystem is described in detail in `GenBCode.scala`.
+
+#### (b) Bytecode-level types, BType ####
+The previous bytecode emitter goes to great lengths to reason about
+bytecode-level types in terms of Symbols.
+
+GenBCode uses BType as a more direct representation. A BType is immutable, and
+a value class (once the rest of GenBCode is merged from
+http://magarciaepfl.github.io/scala/ ).
+
+Whether value class or not, its API is the same. That API doesn't reach into
+the type checker. Instead, each method on a BType answers a question that can
+be answered based on the BType itself. Sounds too simple to be good? It's a
+good building block, that's what it is.
+
+The internal representation of a BType is based on what the JVM uses: internal
+names (e.g. `Ljava/lang/String;`) and method descriptors, as defined in the JVM
+spec (which is why they aren't documented in GenBCode; just read the spec).
+
+All things BType can be found in `BCodeGlue.scala`.
+
+#### (c) Utilities offering a more "high-level" API to bytecode emission ####
+Bytecode can be emitted one opcode at a time, but there are recurring patterns
+that call for a simpler API.
+
+For example, when emitting a load-constant, a dedicated instruction exists for
+emitting load-zero. Similarly, emitting a switch can be done according to one
+of two strategies.
+
+All these utilities are encapsulated in file BCodeIdiomatic.scala. They know
+nothing about the type checker (because, just between us, they don't need to).
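+For example, the load-zero special case can be sketched directly with the ASM
+API (an illustration only, not the dotc helper; `mv` is an assumed
+`MethodVisitor`, and the real helper also covers wider ranges via `BIPUSH` and
+`SIPUSH`):
+
+```scala
+import org.objectweb.asm.{MethodVisitor, Opcodes}
+
+// Sketch of the load-constant pattern: small ints have dedicated opcodes,
+// everything else falls back to an LDC instruction.
+def emitIntConstant(mv: MethodVisitor, i: Int): Unit = i match {
+  case -1 => mv.visitInsn(Opcodes.ICONST_M1)
+  case 0  => mv.visitInsn(Opcodes.ICONST_0)  // dedicated load-zero opcode
+  case n if n >= 1 && n <= 5 => mv.visitInsn(Opcodes.ICONST_0 + n)
+  case n => mv.visitLdcInsn(Integer.valueOf(n))
+}
+```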
+
+#### (d) Mapping between type-checker types and BTypes ####
+So that (c) can remain oblivious to what AST trees contain, some bookkeepers
+are needed:
+
+ - `Tracked`: records, for a bytecode class (BType), its superclass, directly
+   declared interfaces, and inner classes.
+
+To understand how it's built, see:
+
+```scala
+final def exemplar(csym0: Symbol): Tracked = { ... }
+```
+
+Details are in `BCodeTypes.scala`.
+
+#### (e) More "high-level" utilities for bytecode emission ####
+In the spirit of BCodeIdiomatic, utilities are added in BCodeHelpers for
+emitting:
+
+- bean info class
+- mirror classes and their forwarders
+- android-specific creator classes
+- annotations
+
+
+#### (f) Building an ASM ClassNode given an AST TypeDef ####
+This is done by `PlainClassBuilder`.
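+Schematically, the ASM side of this looks as follows (a minimal sketch of
+building a `ClassNode`, not the actual `PlainClassBuilder` code; the class
+name is made up):
+
+```scala
+import org.objectweb.asm.Opcodes
+import org.objectweb.asm.tree.ClassNode
+
+// Minimal sketch: what Worker1/PlainClassBuilder produce for each class.
+val cn = new ClassNode()
+cn.visit(Opcodes.V1_8, Opcodes.ACC_PUBLIC, "example/Foo", null,
+  "java/lang/Object", null)
+// ... methods and fields are added by visiting the TypeDef's body ...
+cn.visitEnd()
+```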
diff --git a/docs/internals/benchmarks.md b/docs/internals/benchmarks.md
new file mode 100644
index 000000000..4d24ec0ff
--- /dev/null
+++ b/docs/internals/benchmarks.md
@@ -0,0 +1,5 @@
+The regression benchmark infrastructure is still under construction.
+
+A preview can be found below:
+
+- [d-d.me/tnc/dotty/web/](https://d-d.me/tnc/dotty/web/) \ No newline at end of file
diff --git a/docs/internals/classpaths.md b/docs/internals/classpaths.md
new file mode 100644
index 000000000..0038b5de0
--- /dev/null
+++ b/docs/internals/classpaths.md
@@ -0,0 +1,42 @@
+When run from the `dotty` script, this is the classloader stack:
+
+```
+=====================================================
+class sun.misc.Launcher$AppClassLoader <= corresponds to java.class.path
+sun.misc.Launcher$AppClassLoader@591ce4fe
+file:/mnt/data-local/Work/Workspace/dev-2.11/dotty/target/scala-2.11.0-M7/dotty_2.11.0-M7-0.1-SNAPSHOT.jar:file:/home/sun/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.11.0-M7.jar:file:/home/sun/.ivy2/cache/org.scala-lang/scala-reflect/jars/scala-reflect-2.11.0-M7.jar
+=====================================================
+class sun.misc.Launcher$ExtClassLoader <= corresponds to sun.boot.class.path
+sun.misc.Launcher$ExtClassLoader@77fe0d66
+file:/usr/lib/jvm/java-7-oracle/jre/lib/ext/sunpkcs11.jar:file:/usr/lib/jvm/java-7-oracle/jre/lib/ext/localedata.jar:file:/usr/lib/jvm/java-7-oracle/jre/lib/ext/zipfs.jar:file:/usr/lib/jvm/java-7-oracle/jre/lib/ext/sunec.jar:file:/usr/lib/jvm/java-7-oracle/jre/lib/ext/sunjce_provider.jar:file:/usr/lib/jvm/java-7-oracle/jre/lib/ext/dnsns.jar
+=====================================================
+```
+
+When running from sbt or Eclipse, the classloader stack is:
+
+```
+=====================================================
+class sbt.classpath.ClasspathUtilities$$anon$1
+sbt.classpath.ClasspathUtilities$$anon$1@22a29f97
+file:/mnt/data-local/Work/Workspace/dev-2.11/dotty/target/scala-2.11.0-M7/classes/:file:/home/sun/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.11.0-M7.jar:file:/home/sun/.ivy2/cache/org.scala-lang/scala-reflect/jars/scala-reflect-2.11.0-M7.jar:file:/home/sun/.ivy2/cache/org.scala-lang.modules/scala-xml_2.11.0-M7/bundles/scala-xml_2.11.0-M7-1.0.0-RC7.jar
+=====================================================
+class java.net.URLClassLoader
+java.net.URLClassLoader@2167c879
+file:/home/sun/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.11.0-M7.jar:file:/home/sun/.ivy2/cache/org.scala-lang/scala-compiler/jars/scala-compiler-2.11.0-M7.jar:file:/home/sun/.ivy2/cache/org.scala-lang/scala-reflect/jars/scala-reflect-2.11.0-M7.jar:file:/home/sun/.ivy2/cache/org.scala-lang.modules/scala-xml_2.11.0-M6/bundles/scala-xml_2.11.0-M6-1.0.0-RC6.jar:file:/home/sun/.ivy2/cache/org.scala-lang.modules/scala-parser-combinators_2.11.0-M6/bundles/scala-parser-combinators_2.11.0-M6-1.0.0-RC4.jar:file:/home/sun/.ivy2/cache/jline/jline/jars/jline-2.11.jar
+=====================================================
+class xsbt.boot.BootFilteredLoader
+xsbt.boot.BootFilteredLoader@73c74402
+not a URL classloader
+=====================================================
+class sun.misc.Launcher$AppClassLoader <= corresponds to java.class.path
+sun.misc.Launcher$AppClassLoader@612dcb8c
+file:/home/sun/.sbt/.lib/0.13.0/sbt-launch.jar
+=====================================================
+class sun.misc.Launcher$ExtClassLoader <= corresponds to sun.boot.class.path
+sun.misc.Launcher$ExtClassLoader@58e862c
+file:/usr/lib/jvm/java-7-oracle/jre/lib/ext/sunpkcs11.jar:file:/usr/lib/jvm/java-7-oracle/jre/lib/ext/localedata.jar:file:/usr/lib/jvm/java-7-oracle/jre/lib/ext/zipfs.jar:file:/usr/lib/jvm/java-7-oracle/jre/lib/ext/sunec.jar:file:/usr/lib/jvm/java-7-oracle/jre/lib/ext/sunjce_provider.jar:file:/usr/lib/jvm/java-7-oracle/jre/lib/ext/dnsns.jar
+=====================================================
+```
+Since scala/dotty only pick up `java.class.path` and `sun.boot.class.path`,
+it's clear why dotty crashes in sbt and Eclipse unless the boot classpath is
+set explicitly.
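+A dump like the ones above can be produced by walking the loader chain upwards
+from the current classloader (a sketch; the exact output depends on the
+environment):
+
+```scala
+import java.net.URLClassLoader
+
+// Print each classloader in the parent chain and, for URL classloaders,
+// the classpath entries it serves.
+var cl: ClassLoader = getClass.getClassLoader
+while (cl != null) {
+  println("=====================================================")
+  println(cl.getClass.getName)
+  cl match {
+    case u: URLClassLoader => println(u.getURLs.mkString(":"))
+    case _                 => println("not a URL classloader")
+  }
+  cl = cl.getParent
+}
+```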
diff --git a/docs/internals/contexts.md b/docs/internals/contexts.md
new file mode 100644
index 000000000..e2111029c
--- /dev/null
+++ b/docs/internals/contexts.md
@@ -0,0 +1,55 @@
+---
+title: Contexts
+layout: default
+---
+
+Contexts
+========
+The `Context` contains the state of the compiler, for example
+ * `settings`
+ * `freshNames` (`FreshNameCreator`)
+ * `period` (run and phase id)
+ * `compilationUnit`
+ * `phase`
+ * `tree` (current tree)
+ * `typer` (current typer)
+ * `mode` (type checking mode)
+ * `typerState` (for example undetermined type variables)
+ * ...
+
+### Contexts in the typer ###
+The type checker passes contexts through all methods and adapts fields where
+necessary, e.g.
+
+```scala
+case tree: untpd.Block => typedBlock(desugar.block(tree), pt)(ctx.fresh.withNewScope)
+```
+
+A number of fields in the context are typer-specific (`mode`, `typerState`).
+
+### In other phases ###
+Other phases need a context for many things, for example to access the
+denotation of a symbol (which depends on the period). However, they typically
+don't need to modify or extend the context while traversing the AST. For these
+phases the context can simply be an implicit class parameter that is then
+available in all members.
+
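+Schematically (with hypothetical stand-in types; in dotc these are
+`Contexts.Context` and a phase class):
+
+```scala
+// Stand-in for dotc's Context, supplied once at construction.
+class Context(val phaseName: String)
+
+class PrintPhase(implicit ctx: Context) {
+  // ctx is in scope in every member without extra parameter lists
+  def describe: String = s"traversing trees in ${ctx.phaseName}"
+}
+
+val phase = new PrintPhase()(new Context("erasure"))
+assert(phase.describe == "traversing trees in erasure")
+```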
+**Careful**: beware of memory leaks. Don't hold on to contexts in long-lived
+objects.
+
+### Using contexts ###
+Nested contexts should be named `ctx` to enable implicit shadowing:
+
+```scala
+scala> class A
+
+scala> def foo(implicit a: A) { def bar(implicit b: A) { println(implicitly[A]) } }
+<console>:8: error: ambiguous implicit values:
+ both value a of type A
+ and value b of type A
+ match expected type A
+ def foo(implicit a: A) { def bar(implicit b: A) { println(implicitly[A]) } }
+
+scala> def foo(implicit a: A) { def bar(implicit a: A) { println(implicitly[A]) } }
+foo: (implicit a: A)Unit
+```
diff --git a/docs/internals/dotc-scalac.md b/docs/internals/dotc-scalac.md
new file mode 100644
index 000000000..cf668cbb8
--- /dev/null
+++ b/docs/internals/dotc-scalac.md
@@ -0,0 +1,104 @@
+---
+layout: default
+title: "Scalac vs Dotty"
+---
+
+Differences between Scalac and Dotty
+====================================
+For an overview of how symbols, named types and denotations hang together, see
+[Denotations.scala:22].
+
+### Denotation ###
+Comment with a few details: [Denotations.scala:70]
+
+A `Denotation` is the result of a name lookup during a given period.
+
+* Most properties of symbols are now in the denotation (name, type, owner,
+ etc.)
+* Denotations usually have a reference to the selected symbol
+* Denotations may be overloaded (`MultiDenotation`). In this case the symbol
+  may be `NoSymbol` (the individual alternatives do have symbols).
+* Non-overloaded denotations have an `info`
+
+Denotations of methods have a signature ([Signature.scala:7]), which
+uniquely identifies overloaded methods.
+
+#### Denotation vs. SymDenotation ####
+A `SymDenotation` is an extended denotation that has symbol-specific properties
+(that may change over phases):
+* `flags`
+* `annotations`
+* `info`
+
+`SymDenotation` implements lazy types (similar to scalac). The type completer
+assigns the denotation's `info`.
+
+#### Implicit Conversion ####
+There is an implicit conversion:
+```scala
+core.Symbols.toDenot(sym: Symbol)(implicit ctx: Context): SymDenotation
+```
+
+Because the class `Symbol` is defined in the object `core.Symbols`, the
+implicit conversion does **not** need to be imported; it is part of the
+implicit scope of the type `Symbol` (see the Scala spec). However, it can
+only be applied if an implicit `Context` is in scope.
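+A self-contained model of why no import is needed (stand-in names, not the
+real dotc definitions, and the `Context` parameter is omitted in this sketch):
+the conversion lives in the same object that defines `Symbol`, so it sits in
+the implicit scope of the type `Symbol`.
+
+```scala
+import scala.language.implicitConversions
+
+object Symbols {
+  class Symbol(val name: String)
+  class SymDenotation(val symbol: Symbol) {
+    def exists: Boolean = true
+  }
+  // Defined in the object that also defines Symbol =>
+  // part of Symbol's implicit scope.
+  implicit def toDenot(sym: Symbol): SymDenotation = new SymDenotation(sym)
+}
+
+import Symbols.Symbol // note: toDenot itself is never imported
+val s = new Symbol("foo")
+assert(s.exists) // conversion found via the implicit scope of Symbol
+```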
+
+### Symbol ###
+* `Symbol` instances have a `SymDenotation`
+* Most symbol properties in scalac are now in the denotation (in dotc)
+
+Most of the `isFooBar` properties in scalac don't exist anymore in dotc. Use
+flag tests instead, for example:
+
+```scala
+if (sym.isPackageClass) // scalac
+if (sym is Flags.PackageClass) // dotc (*)
+```
+
+`(*)` Symbols are implicitly converted to their denotation, see above. Each
+`SymDenotation` has flags that can be queried using the `is` method.
+
+### Flags ###
+* Flags are instances of the value class `FlagSet`, which encapsulates a
+ `Long`
+* Each flag is either valid for types, terms, or both
+
+```
+000..0001000..01
+ ^ ^^
+ flag | \
+ | valid for term
+ valid for type
+```
+
+* Example: `Module` is valid for both module values and module classes,
+ `ModuleVal` / `ModuleClass` for either of the two.
+* `flags.is(Method | Param)`: true if `flags` has either of the two
+* `flags.is(allOf(Method | Deferred))`: true if `flags` has both. `allOf`
+ creates a `FlagConjunction`, so a different overload of `is` is chosen.
+ - Careful: `flags.is(Method & Deferred)` is always true, because `Method &
+ Deferred` is empty.
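+The any-of vs. all-of semantics can be modelled in a few lines (a simplified
+stand-in for dotc's `FlagSet`, dropping the term/type validity bits):
+
+```scala
+// Simplified model: a flag set is a Long bit mask. `is(FlagSet)` tests for
+// *any* common bit; `is(FlagConjunction)` tests that *all* bits are present.
+case class FlagSet(bits: Long) {
+  def | (that: FlagSet) = FlagSet(bits | that.bits)
+  def is(flags: FlagSet): Boolean = (bits & flags.bits) != 0
+  def is(flags: FlagConjunction): Boolean = (bits & flags.bits) == flags.bits
+}
+case class FlagConjunction(bits: Long)
+def allOf(flags: FlagSet) = FlagConjunction(flags.bits)
+
+val Method   = FlagSet(1L << 0)
+val Param    = FlagSet(1L << 1)
+val Deferred = FlagSet(1L << 2)
+
+val flags = Method | Deferred
+assert(flags.is(Method | Param))           // has at least one of the two
+assert(flags.is(allOf(Method | Deferred))) // has both
+```
+
+In this simplified model the intersection of two disjoint flags is plain
+empty; in the real `FlagSet` the shared validity bits keep it non-empty, which
+is what makes the `flags.is(Method & Deferred)` test above vacuously true.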
+
+### Tree ###
+* Trees don't have symbols
+ - `tree.symbol` is `tree.denot.symbol`
+  - `tree.denot` is `tree.tpe.denot` where the `tpe` is a `NamedType` (see
+    next point)
+* Subclasses of `DenotingTree` (`Template`, `ValDef`, `DefDef`, `Select`,
+ `Ident`, etc.) have a `NamedType`, which has a `denot` field. The
+ denotation has a symbol.
+ - The `denot` of a `NamedType` (prefix + name) for the current period is
+ obtained from the symbol that the type refers to. This symbol is searched
+ using `prefix.member(name)`.
+
+
+### Type ###
+ * `MethodType(paramSyms, resultType)` from scalac =>
+   `mt @ MethodType(paramNames, paramTypes)`. The result type is `mt.resultType`.
+
+`@todo`
+
+[Denotations.scala:22]: https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/core/Denotations.scala#L22
+[Denotations.scala:70]: https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/core/Denotations.scala#L70
+[Signature.scala:7]: https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/core/Signature.scala#L7
diff --git a/docs/internals/higher-kinded-v2.md b/docs/internals/higher-kinded-v2.md
new file mode 100644
index 000000000..3019e3031
--- /dev/null
+++ b/docs/internals/higher-kinded-v2.md
@@ -0,0 +1,461 @@
+---
+layout: default
+title: "Higher-Kinded Types in Dotty"
+---
+
+**This page is out of date and preserved for posterity. Please see [Implementing
+Higher-Kinded Types in
+Dotty](http://guillaume.martres.me/publications/dotty-hk.pdf) for a more up to
+date version**
+
+Higher-Kinded Types in Dotty V2
+===============================
+This note outlines how we intend to represent higher-kinded types in Dotty.
+The principal idea is to collapse the four previously disparate features of
+refinements, type parameters, existentials and higher-kinded types into just
+one: refinements of type members. All other features will be encoded using
+these refinements.
+
+The complexity of type systems tends to grow exponentially with the number of
+independent features, because there are an exponential number of possible
+feature interactions. Consequently, a reduction from 4 to 1 fundamental
+features achieves a dramatic reduction of complexity. It also adds some nice
+usability improvements, notably in the area of partial type application.
+
+This is a second version of the scheme which differs in a key aspect from the
+first one: Following Adriaan's idea, we use traits with type members to model
+type lambdas and type applications. This is both more general and more robust
+than the intersections with type constructor traits that we had in the first
+version.
+
+The duality
+-----------
+The core idea: A parameterized class such as
+
+```scala
+class Map[K, V]
+```
+
+is treated as equivalent to a type with type members:
+
+```scala
+class Map { type Map$K; type Map$V }
+```
+
+The type members are name-mangled (i.e. `Map$K`) so that they do not conflict
+with other members or parameters named `K` or `V`.
+
+A type-instance such as `Map[String, Int]` would then be treated as equivalent
+to:
+
+```scala
+Map { type Map$K = String; type Map$V = Int }
+```
+
+Named type parameters
+---------------------
+Type parameters can have unmangled names. This is achieved by adding the `type`
+keyword to a type parameter declaration, analogous to how `val` indicates a
+named field. For instance,
+
+```scala
+class Map[type K, type V]
+```
+
+is treated as equivalent to
+
+```scala
+class Map { type K; type V }
+```
+
+The parameters are made visible as fields.
+
+Wildcards
+---------
+A wildcard type such as `Map[_, Int]` is equivalent to:
+
+```scala
+Map { type Map$V = Int }
+```
+
+That is, each `_` leaves the corresponding parameter uninstantiated. Wildcard
+arguments can have bounds. E.g.
+
+```scala
+Map[_ <: AnyRef, Int]
+```
+
+is equivalent to:
+
+```scala
+Map { type Map$K <: AnyRef; type Map$V = Int }
+```
+
+Type parameters in the encodings
+--------------------------------
+The notion of type parameters makes sense even for encoded types, which do not
+contain parameter lists in their syntax. Specifically, the type parameters of a
+type are a sequence of type fields that correspond to parameters in the
+unencoded type. They are determined as follows.
+
+* The type parameters of a class or trait type are those parameter fields declared in the class
+ that are not yet instantiated, in the order they are given. Type parameter fields of parents
+ are not considered.
+* The type parameters of an abstract type are the type parameters of its upper bound.
+* The type parameters of an alias type are the type parameters of its right hand side.
+* The type parameters of every other type are the empty sequence.
+
+Partial applications
+--------------------
+The definition of type parameters in the previous section leads to a simple
+model of partial applications. Consider for instance:
+
+```scala
+type Histogram = Map[_, Int]
+```
+
+`Histogram` is a higher-kinded type that still has one type parameter.
+`Histogram[String]` would be a possible type instance, and it would be
+equivalent to `Map[String, Int]`.
+
+
+Modelling polymorphic type declarations
+---------------------------------------
+The partial application scheme gives us a new -- and quite elegant -- way to do
+certain higher-kinded types. But how do we interpret the polymorphic types
+that exist in current Scala?
+
+More concretely, current Scala allows us to write parameterized type
+definitions, abstract types, and type parameters. In the new scheme, only
+classes (and traits) can have parameters and these are treated as equivalent to
+type members. Type aliases and abstract types do not allow the definition of
+parameterized types, so we have to interpret polymorphic type aliases and
+abstract types specially.
+
+Modelling polymorphic type aliases: simple case
+-----------------------------------------------
+A polymorphic type alias such as:
+
+```scala
+type Pair[T] = Tuple2[T, T]
+```
+
+where `Tuple2` is declared as
+
+```scala
+class Tuple2[T1, T2] ...
+```
+
+is expanded to a monomorphic type alias like this:
+
+```scala
+type Pair = Tuple2 { type Tuple2$T2 = Tuple2$T1 }
+```
+
+More generally, each type parameter of the left-hand side must appear as a type
+member of the right hand side type. Type members must appear in the same order
+as their corresponding type parameters. References to the type parameter are
+then translated to references to the type member. The type member itself is
+left uninstantiated.
+
+This technique can expand most polymorphic type aliases appearing in Scala
+codebases but not all of them. For instance, the following alias cannot be
+expanded, because the parameter type `T` is not a type member of the right-hand
+side `List[List[T]]`.
+
+```scala
+type List2[T] = List[List[T]]
+```
+
+We scanned the Scala standard library for occurrences of polymorphic type
+aliases and determined that only two occurrences could not be expanded. In
+`io/Codec.scala`:
+
+```scala
+type Configure[T] = (T => T, Boolean)
+```
+
+And in `collection/immutable/HashMap.scala`:
+
+```scala
+private type MergeFunction[A1, B1] = ((A1, B1), (A1, B1)) => (A1, B1)
+```
+
+For these cases, we use a fall-back scheme that models a parameterized alias as
+a `Lambda` type.
+
+Modelling polymorphic type aliases: general case
+------------------------------------------------
+A polymorphic type alias such as:
+
+```scala
+type List2D[T] = List[List[T]]
+```
+
+is represented as a monomorphic type alias of a type lambda. Here's the
+expanded version of the definition above:
+
+```scala
+type List2D = Lambda$I { type Apply = List[List[$hkArg$0]] }
+```
+
+Here, `Lambda$I` is a standard trait defined as follows:
+
+```scala
+trait Lambda$I[type $hkArg$0] { type +Apply }
+```
+
+The `I` suffix of the `Lambda` trait indicates that it has one invariant type
+parameter (named `$hkArg$0`). Other suffixes are `P` for covariant type
+parameters, and `N` for contravariant type parameters. Lambda traits can have
+more than one type parameter. For instance, here is a trait with contravariant
+and covariant type parameters:
+
+```scala
+trait Lambda$NP[type -$hkArg$0, +$hkArg$1] { type +Apply } extends Lambda$IP with Lambda$NI
+```
+
+Aside: the `+` prefix in front of `Apply` indicates that `Apply` is a covariant
+type field. Dotty admits variance annotations on type members.
+
+The definition of `Lambda$NP` shows that `Lambda` traits form a subtyping
+hierarchy: Traits which have covariant or contravariant type parameters are
+subtypes of traits which don't. The supertraits of `Lambda$NP` would themselves
+be written as follows.
+
+```scala
+trait Lambda$IP[type $hkArg$0, +$hkArg$1] { type +Apply } extends Lambda$II
+trait Lambda$NI[type -$hkArg$0, $hkArg$1] { type +Apply } extends Lambda$II
+trait Lambda$II[type $hkArg$0, $hkArg$1] { type +Apply }
+```
+
+`Lambda` traits are special in that they influence how type applications are
+expanded: If the standard type application `T[X1, ..., Xn]` leads to a subtype
+`S` of a type instance
+
+```scala
+LambdaXYZ { type Arg1 = T1; ...; type ArgN = Tn; type Apply ... }
+```
+
+where all argument fields `Arg1, ..., ArgN` are concretely defined and the
+definition of the `Apply` field may be either abstract or concrete, then the
+application is further expanded to `S # Apply`.
+
+For instance, the type instance `List2D[String]` would be expanded to
+
+```scala
+Lambda$I { type $hkArg$0 = String; type Apply = List[List[String]] } # Apply
+```
+
+which in turn simplifies to `List[List[String]]`.
+
+2nd Example: Consider the two aliases
+
+```scala
+type RMap[K, V] = Map[V, K]
+type RRMap[K, V] = RMap[V, K]
+```
+
+These expand as follows:
+
+```scala
+type RMap = Lambda$II { self1 => type Apply = Map[self1.$hkArg$1, self1.$hkArg$0] }
+type RRMap = Lambda$II { self2 => type Apply = RMap[self2.$hkArg$1, self2.$hkArg$0] }
+```
+
+Substituting the definition of `RMap` and expanding the type application gives:
+
+```scala
+type RRMap = Lambda$II { self2 => type Apply =
+ Lambda$II { self1 => type Apply = Map[self1.$hkArg$1, self1.$hkArg$0] }
+ { type $hkArg$0 = self2.$hkArg$1; type $hkArg$1 = self2.$hkArg$0 } # Apply }
+```
+
+Substituting the definitions of `self1.$hkArg$0` and `self1.$hkArg$1` gives:
+
+```scala
+type RRMap = Lambda$II { self2 => type Apply =
+ Lambda$II { self1 => type Apply = Map[self2.$hkArg$0, self2.$hkArg$1] }
+ { type $hkArg$0 = self2.$hkArg$1; type $hkArg$1 = self2.$hkArg$0 } # Apply }
+```
+
+Simplifying the `# Apply` selection gives:
+
+```scala
+type RRMap = Lambda$II { self2 => type Apply = Map[self2.$hkArg$0, self2.$hkArg$1] }
+```
+
+This can be regarded as the eta-expanded version of `Map`. It has the same expansion as
+
+```scala
+type IMap[K, V] = Map[K, V]
+```
+
+Modelling higher-kinded types
+-----------------------------
+The encoding of higher-kinded types again uses the `Lambda` traits to
+represent type constructors. Consider the higher-kinded type declaration
+
+```scala
+type Rep[T]
+```
+
+We expand this to
+
+```scala
+type Rep <: Lambda$I
+```
+
+The type parameters of `Rep` are the type parameters of its upper bound, so
+`Rep` is a unary type constructor.
+
+More generally, a higher-kinded type declaration
+
+```scala
+type T[v1 X1 >: S1 <: U1, ..., vn XN >: SN <: UN] >: SR <: UR
+```
+
+is encoded as
+
+```scala
+type T <: LambdaV1...Vn { self =>
+ type v1 $hkArg$0 >: s(S1) <: s(U1)
+ ...
+ type vn $hkArg$N >: s(SN) <: s(UN)
+ type Apply >: s(SR) <: s(UR)
+}
+```
+
+where `s` is the substitution `[XI := self.$hkArg$I | I = 1,...,N]`.
+
+If we instantiate `Rep` with a type argument, this is expanded as was explained
+before.
+
+```scala
+Rep[String]
+```
+
+would expand to
+
+```scala
+Rep { type $hkArg$0 = String } # Apply
+```
+
+If we instantiate the higher-kinded type with a concrete type constructor (i.e.
+a parameterized trait or class), we have to do one extra adaptation to make it
+work. The parameterized trait or class has to be eta-expanded so that it
+conforms to the `Lambda` bound. For instance,
+
+```scala
+type Rep = Set
+```
+
+would expand to:
+
+```scala
+type Rep = Lambda1 { type Apply = Set[$hkArg$0] }
+```
+
+Or,
+
+```scala
+type Rep = Map[String, _]
+```
+
+would expand to
+
+```scala
+type Rep = Lambda1 { type Apply = Map[String, $hkArg$0] }
+```
+
+Full example
+------------
+Consider the higher-kinded `Functor` type class
+
+```scala
+class Functor[F[_]] {
+ def map[A, B](f: A => B): F[A] => F[B]
+}
+```
+
+This would be represented as follows:
+
+```scala
+class Functor[F <: Lambda1] {
+ def map[A, B](f: A => B): F { type $hkArg$0 = A } # Apply => F { type $hkArg$0 = B } # Apply
+}
+```
+
+The type `Functor[List]` would be represented as follows
+
+```scala
+Functor {
+ type F = Lambda1 { type Apply = List[$hkArg$0] }
+}
+```
+
+Now, assume we have a value
+
+```scala
+val ml: Functor[List]
+```
+
+Then `ml.map` would have type
+
+```scala
+s(F { type $hkArg$0 = A } # Apply => F { type $hkArg$0 = B } # Apply)
+```
+
+where `s` is the substitution of `[F := Lambda1 { type Apply = List[$hkArg$0] }]`.
+This gives:
+
+```scala
+Lambda1 { type Apply = List[$hkArg$0] } { type $hkArg$0 = A } # Apply
+ => Lambda1 { type Apply = List[$hkArg$0] } { type $hkArg$0 = B } # Apply
+```
+
+This type simplifies to:
+
+```scala
+List[A] => List[B]
+```
+
+Status of `#`
+-------------
+In the scheme above we have silently assumed that `#` "does the right thing",
+i.e. that the types are well-formed and we can collapse a type alias with a `#`
+projection, thereby giving us a form of beta reduction.
+
+In Scala 2.x, this would not work, because `T#X` means `x.X forSome { val x: T
+}`. Hence, two occurrences of `Rep[Int]` say, would not be recognized to be
+equal because the existential would be opened each time afresh.
+
+In pre-existentials Scala, this would not have worked either. There, `T#X` was
+a fundamental type constructor, but was restricted to alias types or classes
+for both `T` and `X`. Roughly, `#` was meant to encode Java's inner classes.
+In Java, given the classes
+
+```java
+class Outer { class Inner }
+class Sub1 extends Outer
+class Sub2 extends Outer
+```
+
+The types `Outer#Inner`, `Sub1#Inner` and `Sub2#Inner` would all exist and be
+regarded as equal to each other. But if `Outer` had abstract type members this
+would not work, since an abstract type member could be instantiated differently
+in `Sub1` and `Sub2`. Assuming that `Sub1#Inner = Sub2#Inner` could then lead
+to a soundness hole. To avoid soundness problems, the types in `X#Y` were
+restricted so that `Y` was (an alias of) a class type and `X` was (an alias of)
+a class type with no abstract type members.
+
+I believe we can go back to regarding `T#X` as a fundamental type constructor,
+the way it was done in pre-existential Scala, but with the following relaxed
+restriction:
+
+> In a type selection `T#X`, `T` is not allowed to have any abstract members different from `X`
+
+This would typecheck the higher-kinded type examples, because they only
+project with `# Apply` once all `$hkArg$` type members are fully instantiated.
+
+It would be good to study this rule formally, trying to verify its soundness.
diff --git a/docs/internals/overall-structure.md b/docs/internals/overall-structure.md
new file mode 100644
index 000000000..214e47aa5
--- /dev/null
+++ b/docs/internals/overall-structure.md
@@ -0,0 +1,191 @@
+---
+layout: default
+title: "Project Structure"
+---
+
+Dotc Overall Structure
+======================
+The compiler code is found in package [dotty.tools]. It spans the
+following three sub-packages:
+
+```none
+backend Compiler backends (currently for JVM and JS)
+ dotc The main compiler
+ io Helper modules for file access and classpath handling.
+```
+
+The [dotc] package contains some main classes that can be run as separate
+programs. The most important one is class [Main]. `Main` inherits from
+[Driver] which contains the highest level functions for starting a compiler
+and processing some sources. `Driver` in turn is based on two other high-level
+classes, [Compiler] and [Run].
+
+Package Structure
+-----------------
+Most functionality of `dotc` is implemented in subpackages of `dotc`. Here's a
+list of sub-packages and their focus.
+
+```none
+.
+├── ast // Abstract syntax trees
+├── config // Compiler configuration, settings, platform specific definitions.
+├── core // Core data structures and operations, with specific subpackages for:
+│   ├── classfile // Reading of Java classfiles into core data structures
+│   ├── tasty // Reading and writing of TASTY files to/from core data structures
+│   └── unpickleScala2 // Reading of Scala2 symbol information into core data structures
+├── parsing // Scanner and parser
+├── printing // Pretty-printing trees, types and other data
+├── repl // The interactive REPL
+├── reporting // Reporting of error messages, warnings and other info.
+├── rewrite // Helpers for rewriting Scala 2's constructs into dotty's.
+├── transform // Miniphases and helpers for tree transformations.
+├── typer // Type-checking and other frontend phases
+└── util // General purpose utility classes and modules.
+```
+
+Contexts
+--------
+`dotc` has almost no global state (the only significant bit of global state is
+the name table, which is used to hash strings into unique names). Instead, all
+essential bits of information that can vary over a compiler run are collected
+in a [Context]. Most methods in `dotc` take a `Context` value as an implicit
+parameter.
+
+Contexts give a convenient way to customize values in some part of the
+call-graph. To run, e.g., some compiler function `f` at a given phase `phase`,
+we invoke `f` with an explicit context parameter, like this:
+
+```scala
+f(/*normal args*/)(ctx.withPhase(phase))
+```
+
+This assumes that `f` is defined in the way most compiler functions are:
+
+```scala
+def f(/*normal parameters*/)(implicit ctx: Context) ...
+```
+
+Compiler code follows the convention that all implicit `Context` parameters
+are named `ctx`. This is important to avoid implicit ambiguities when nested
+methods each take their own `Context` parameter: the common name ensures that
+the inner implicit parameter properly shadows the outer one.
+
+Sometimes we want to make sure that implicit contexts are not captured in
+closures or other long-lived objects, be it because we want to enforce that
+nested methods each get their own implicit context, or because we want to avoid
+a space leak in the case where a closure can survive several compiler runs. A
+typical case is a completer for a symbol representing an external class, which
+produces the attributes of the symbol on demand, and which might never be
+invoked. In that case we follow the convention that any context parameter is
+explicit, not implicit, so we can track where it is used, and that it has a
+name different from `ctx`. Commonly used is `ictx` for "initialization
+context".
+
+With these two conventions in place, it has turned out that implicit contexts
+work amazingly well as a device for dependency injection and bulk
+parameterization. There is of course always the danger that an unexpected
+implicit will be passed, but in practice this has not turned out to be much of
+a problem.
+
+Compiler Phases
+---------------
+Seen from a temporal perspective, the `dotc` compiler consists of a list of
+phases. The current list of phases is specified in class [Compiler] as follows:
+
+```scala
+ def phases: List[List[Phase]] = List(
+ List(new FrontEnd), // Compiler frontend: scanner, parser, namer, typer
+ List(new PostTyper), // Additional checks and cleanups after type checking
+ List(new Pickler), // Generate TASTY info
+ List(new FirstTransform, // Some transformations to put trees into a canonical form
+ new CheckReentrant), // Internal use only: Check that compiled program has no data races involving global vars
+ List(new RefChecks, // Various checks mostly related to abstract members and overriding
+ new CheckStatic, // Check restrictions that apply to @static members
+ new ElimRepeated, // Rewrite vararg parameters and arguments
+ new NormalizeFlags, // Rewrite some definition flags
+ new ExtensionMethods, // Expand methods of value classes with extension methods
+ new ExpandSAMs, // Expand single abstract method closures to anonymous classes
+ new TailRec, // Rewrite tail recursion to loops
+ new LiftTry, // Put try expressions that might execute on non-empty stacks into their own methods
+ new ClassOf), // Expand `Predef.classOf` calls.
+ List(new PatternMatcher, // Compile pattern matches
+ new ExplicitOuter, // Add accessors to outer classes from nested ones.
+ new ExplicitSelf, // Make references to non-trivial self types explicit as casts
+ new CrossCastAnd, // Normalize selections involving intersection types.
+ new Splitter), // Expand selections involving union types into conditionals
+ List(new VCInlineMethods, // Inlines calls to value class methods
+ new SeqLiterals, // Express vararg arguments as arrays
+ new InterceptedMethods, // Special handling of `==`, `|=`, `getClass` methods
+ new Getters, // Replace non-private vals and vars with getter defs (fields are added later)
+ new ElimByName, // Expand by-name parameters and arguments
+ new AugmentScala2Traits, // Expand traits defined in Scala 2.11 to simulate old-style rewritings
+ new ResolveSuper), // Implement super accessors and add forwarders to trait methods
+ List(new Erasure), // Rewrite types to JVM model, erasing all type parameters, abstract types and refinements.
+ List(new ElimErasedValueType, // Expand erased value types to their underlying implementation types
+ new VCElideAllocations, // Peep-hole optimization to eliminate unnecessary value class allocations
+ new Mixin, // Expand trait fields and trait initializers
+ new LazyVals, // Expand lazy vals
+ new Memoize, // Add private fields to getters and setters
+ new LinkScala2ImplClasses, // Forward calls to the implementation classes of traits defined by Scala 2.11
+ new NonLocalReturns, // Expand non-local returns
+ new CapturedVars, // Represent vars captured by closures as heap objects
+ new Constructors, // Collect initialization code in primary constructors
+ // Note: constructors changes decls in transformTemplate, no InfoTransformers should be added after it
+ new FunctionalInterfaces,// Rewrites closures to implement @specialized types of Functions.
+ new GetClass), // Rewrites getClass calls on primitive types.
+ List(new LambdaLift, // Lifts out nested functions to class scope, storing free variables in environments
+ // Note: in this mini-phase block scopes are incorrect. No phases that rely on scopes should be here
+ new ElimStaticThis, // Replace `this` references to static objects by global identifiers
+ new Flatten, // Lift all inner classes to package scope
+ new RestoreScopes), // Repair scopes rendered invalid by moving definitions in prior phases of the group
+ List(new ExpandPrivate, // Widen private definitions accessed from nested classes
+ new CollectEntryPoints, // Find classes with main methods
+ new LabelDefs), // Converts calls to labels to jumps
+ List(new GenSJSIR), // Generate .js code
+ List(new GenBCode) // Generate JVM bytecode
+ )
+```
+
+Note that phases are grouped, so the `phases` method is of type
+`List[List[Phase]]`. The idea is that all phases in a group are *fused* into a
+single tree traversal. That way, phases can be kept small (most phases perform
+a single function) without requiring an excessive number of tree traversals
+(which are costly, because they generally have bad cache locality).
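+
+The grouping can be modeled as follows (a simplified sketch, not dotc's
+actual `Phase` API):
+
+```scala
+trait Phase { def transform(tree: String): String }
+
+// Each inner list is fused into a single traversal: the tree is walked
+// once per group, with every phase of the group applied along the way.
+def runPhases(groups: List[List[Phase]], tree: String): String =
+  groups.foldLeft(tree) { (t, group) =>
+    group.foldLeft(t)((t1, phase) => phase.transform(t1))
+  }
+```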
+
+Phases fall into four categories:
+
+* Frontend phases: `FrontEnd`, `PostTyper` and `Pickler`. `FrontEnd` parses the
+ source programs and generates untyped abstract syntax trees, which are then
+ typechecked and transformed into typed abstract syntax trees. `PostTyper`
+ performs checks and cleanups that require a fully typed program. In
+ particular, it
+
+ - creates super accessors representing `super` calls in traits
+ - creates implementations of synthetic (compiler-implemented) methods
+ - avoids storing parameters passed unchanged from subclass to superclass in
+ duplicate fields.
+
+ Finally `Pickler` serializes the typed syntax trees produced by the frontend
+ as TASTY data structures.
+
+* High-level transformations: All phases from `FirstTransform` to `Erasure`.
+ Most of these phases transform syntax trees, expanding high-level constructs
+  to more primitive ones. The last phase in the group, `Erasure`, translates all
+ types into types supported directly by the JVM. To do this, it performs
+ another type checking pass, but using the rules of the JVM's type system
+ instead of Scala's.
+
+* Low-level transformations: All phases from `ElimErasedValueType` to
+ `LabelDefs`. These further transform trees until they are essentially a
+ structured version of Java bytecode.
+
+* Code generators: These map the transformed trees to Java classfiles or
+  JavaScript files.
+
+[dotty.tools]: https://github.com/lampepfl/dotty/tree/master/src/dotty/tools
+[dotc]: https://github.com/lampepfl/dotty/tree/master/src/dotty/tools/dotc
+[Main]: https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/Main.scala
+[Driver]: https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/Driver.scala
+[Compiler]: https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/Compiler.scala
+[Run]: https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/Run.scala
+[Context]: https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/core/Contexts.scala
diff --git a/docs/internals/periods.md b/docs/internals/periods.md
new file mode 100644
index 000000000..fe788915d
--- /dev/null
+++ b/docs/internals/periods.md
@@ -0,0 +1,96 @@
+---
+layout: default
+title: "Periods"
+toc: true
+---
+
+Dotc's concept of time
+======================
+Conceptually, the `dotc` compiler's job is to maintain views of various
+artifacts associated with source code at all points in time. But what is
+*time* for `dotc`? In fact, it is a combination of compiler runs and compiler
+phases.
+
+The *hours* of the compiler's clocks are measured in compiler [runs]. Every run
+creates a new hour, which follows all the compiler runs (hours) that happened
+before. `dotc` is designed to be used as an incremental compiler that can
+support incremental builds, as well as interactions in an IDE and a REPL. This
+means that new runs can occur quite frequently. At the extreme, every
+keystroke in an editor or REPL can potentially launch a new compiler run, so
+potentially an "hour" of compiler time might take only a fraction of a second
+in real time.
+
+The *minutes* of the compiler's clocks are measured in phases. At every
+compiler run, the compiler cycles through a number of [phases]. The list of
+phases is defined in the [Compiler] object. There are currently about 60 phases
+per run, so the minutes/hours analogy works out roughly. After every phase the
+view the compiler has of the world changes: trees are transformed, types are
+gradually simplified from Scala types to JVM types, definitions are rearranged,
+and so on.
+
+Many pieces of information in the compiler are time-dependent. For instance, a
+Scala symbol representing a definition has a type, but that type will usually
+change as one goes from the higher-level Scala view of things to the
+lower-level JVM view. There are different ways to deal with this. Many
+compilers change the type of a symbol destructively according to the "current
+phase". Another, more functional approach might be to have different symbols
+representing the same definition at different phases, with each symbol
+carrying a different immutable type. `dotc` employs yet another scheme, which
+is inspired by functional reactive programming (FRP): Symbols carry not a
+single type, but a function from compiler phase to type. So the type of a
+symbol is a time-indexed function, where time ranges over compiler phases.
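+
+The idea can be sketched like this (a toy model; dotc's actual scheme goes
+through denotations and the current `Context`):
+
+```scala
+case class Phase(id: Int)
+
+// A symbol's type is a function of the phase, not a mutable field.
+class Symbol(infoAt: Phase => String) {
+  def info(implicit phase: Phase): String = infoAt(phase)
+}
+```
+
+Asking for `info` at an early phase yields the Scala view of a type; asking at
+a later phase yields the gradually simplified, JVM-level view.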
+
+Typically, the definition of a symbol or other quantity remains stable for a
+number of phases. This leads us to the concept of a [period]. Conceptually, a
+period is an interval of phases in a given compiler run. Periods are
+represented by three pieces of information:
+
+* the ID of the current run,
+* the ID of the phase starting the period, and
+* the number of phases in the period.
+
+All three pieces of information are encoded in a value class over a 32-bit
+integer. Here's the API for class `Period`:
+
+```scala
+class Period(val code: Int) extends AnyVal {
+ def runId: RunId // The run identifier of this period.
+ def firstPhaseId: PhaseId // The first phase of this period
+ def lastPhaseId: PhaseId // The last phase of this period
+ def phaseId: PhaseId // The phase identifier of this single-phase period
+
+ def containsPhaseId(id: PhaseId): Boolean
+ def contains(that: Period): Boolean
+ def overlaps(that: Period): Boolean
+
+ def & (that: Period): Period
+ def | (that: Period): Period
+}
+```
+
+We can access the parts of a period using `runId`, `firstPhaseId`,
+`lastPhaseId`, or using `phaseId` for periods consisting only of a single
+phase. They return `RunId` or `PhaseId` values, which are aliases of `Int`.
+`containsPhaseId`, `contains` and `overlaps` test whether a period contains a
+phase or a period as a sub-interval, or whether the interval overlaps with
+another period. Finally, `&` and `|` produce the intersection and the union of
+two period intervals (the union operation `|` takes as `runId` the `runId` of
+its left operand, since periods spanning different `runId`s cannot be constructed).
+
+Periods are constructed using two `apply` methods:
+
+```scala
+object Period {
+ /** The single-phase period consisting of given run id and phase id */
+ def apply(rid: RunId, pid: PhaseId): Period
+
+ /** The period consisting of given run id, and lo/hi phase ids */
+ def apply(rid: RunId, loPid: PhaseId, hiPid: PhaseId): Period
+}
+```
+
+As a sentinel value there's `Nowhere`, a period that is empty.
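+
+To make the encoding concrete, here is a simplified sketch of how the three
+pieces of information could be packed into one `Int` (the actual bit layout in
+`Periods.scala` differs, and a plain class stands in for the value class):
+
+```scala
+class Period(val code: Int) {
+  def runId: Int        = code >>> 16         // high bits: run id
+  def firstPhaseId: Int = (code >>> 8) & 0xFF // middle bits: first phase
+  def lastPhaseId: Int  = code & 0xFF         // low bits: last phase
+
+  def contains(that: Period): Boolean =
+    runId == that.runId &&
+      firstPhaseId <= that.firstPhaseId &&
+      that.lastPhaseId <= lastPhaseId
+}
+
+object Period {
+  def apply(rid: Int, loPid: Int, hiPid: Int): Period =
+    new Period((rid << 16) | (loPid << 8) | hiPid)
+}
+```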
+
+[runs]: https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/Run.scala
+[phases]: https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/core/Phases.scala
+[period]: https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/core/Periods.scala
diff --git a/docs/internals/type-system.md b/docs/internals/type-system.md
new file mode 100644
index 000000000..191c107cf
--- /dev/null
+++ b/docs/internals/type-system.md
@@ -0,0 +1,134 @@
+---
+layout: default
+---
+
+Type System
+===========
+The types are defined in [dotty/tools/dotc/core/Types.scala][1].
+
+## Class diagram ##
+- [PDF][2], generated with [a fork of scaladiagrams][3]
+
+## Proxy types and ground types ##
+A type which inherits `TypeProxy` is a proxy for another type accessible using
+the `underlying` method; other types are called _ground_ types and inherit
+`CachedGroundType` or `UncachedGroundType`.
+
+```
+Type -+- ProxyType --+- NamedType ----+--- TypeRef
+ | | \
+ | +- SingletonType-+-+- TermRef
+ | | |
+ | | +--- ThisType
+ | | +--- SuperType
+ | | +--- ConstantType
+ | | +--- MethodParam
+ | | +--- RefinedThis
+ | +- PolyParam
+ | +- RefinedType
+ | +- TypeBounds
+ | +- ExprType
+ | +- AnnotatedType
+ | +- TypeVar
+ |
+ +- GroundType -+- AndType
+ +- OrType
+ +- MethodType -----+- ImplicitMethodType
+ | +- JavaMethodType
+ +- PolyType
+ +- ClassInfo
+ |
+ +- NoType
+ +- NoPrefix
+ +- ErrorType
+ +- WildcardType
+
+```
+
+## Representations of types ##
+ Type | Representation
+ ------------------------- | -----------------------------
+ `p.x.type` | `TermRef(p, x)`
+ `p#T` | `TypeRef(p, T)`
+ `p.x.T` == `p.x.type#T` | `TypeRef(TermRef(p, x), T)`
+ `this.type` | `ThisType`
+ `A & B` | `AndType(A, B)`
+ `A | B` | `OrType(A, B)`
+ `=> T` | `ExprType(T)`
+ `p { refinedName }` | `RefinedType(p, refinedName)`
+ type of the value `super` | `SuperType`
+ `type T >: A <: B` | `TypeRef` with underlying type `RealTypeBounds(A, B)`
+ `type T = A` | `TypeRef` with underlying type `TypeAlias(A)`
+ `class p.C ...` | `ClassInfo(p, C, ...)`
+
+### Representation of methods ###
+```scala
+def f[A, B <: Ord[A]](x: A, y: B): Unit
+```
+is represented as:
+
+```scala
+val p = PolyType(List("A", "B"))(
+ List(TypeBounds(Nothing, Any),
+ TypeBounds(Nothing,
+ RefinedType(Ordering,
+ scala$math$Ordering$$T, TypeAlias(PolyParam(p, 0))))),
+ m)
+
+val m = MethodType(List("x", "y"),
+ List(PolyParam(p, 0), PolyParam(p, 1)))(Unit)
+```
+(This is a slightly simplified version, e.g. we write `Unit` instead of
+`TypeRef(TermRef(ThisType(TypeRef(NoPrefix,<root>)),scala),Unit)`).
+
+Note that a `PolyParam` refers to a type parameter by its index (here `A` is 0
+and `B` is 1).
+
+## Subtyping checks ##
+`topLevelSubType(tp1, tp2)` in [dotty/tools/dotc/core/TypeComparer.scala][4]
+checks if `tp1` is a subtype of `tp2`.
+
+### Type rebasing ###
+**FIXME**: This section is no longer accurate because
+https://github.com/lampepfl/dotty/pull/331 changed the handling of refined
+types.
+
+Consider [tests/pos/refinedSubtyping.scala][5]
+```scala
+class Test {
+
+ class C { type T; type Coll }
+
+ type T1 = C { type T = Int }
+
+ type T11 = T1 { type Coll = Set[Int] }
+
+ type T2 = C { type Coll = Set[T] }
+
+ type T22 = T2 { type T = Int }
+
+ var x: T11 = _
+ var y: T22 = _
+
+ x = y
+ y = x
+
+}
+```
+We want to perform the subtyping checks recursively: it would be nice if we
+could check whether `T22 <: T11` by first checking whether `T2 <: T1`. To
+achieve this recursive check, we remember that `T2#T` is really `T22#T`. This
+procedure is called rebasing and is done by storing refined names in
+`pendingRefinedBases` and looking them up using `rebase`.
+
+## Type caching ##
+TODO
+
+## Type inference via constraint solving ##
+TODO
+
+[1]: https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/core/Types.scala
+[2]: https://github.com/samuelgruetter/dotty/blob/classdiagrampdf/dotty-types.pdf
+[3]: https://github.com/samuelgruetter/scaladiagrams/tree/print-descendants
+[4]: https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/core/TypeComparer.scala
+[5]: https://github.com/lampepfl/dotty/blob/master/tests/pos/refinedSubtyping.scala
diff --git a/docs/js/highlight.pack.js b/docs/js/highlight.pack.js
new file mode 100644
index 000000000..ce3e795d1
--- /dev/null
+++ b/docs/js/highlight.pack.js
@@ -0,0 +1,2 @@
+/*! highlight.js v9.7.0 | BSD3 License | git.io/hljslicense */
+!function(e){var n="object"==typeof window&&window||"object"==typeof self&&self;"undefined"!=typeof exports?e(exports):n&&(n.hljs=e({}),"function"==typeof define&&define.amd&&define([],function(){return n.hljs}))}(function(e){function n(e){return e.replace(/[&<>]/gm,function(e){return I[e]})}function t(e){return e.nodeName.toLowerCase()}function r(e,n){var t=e&&e.exec(n);return t&&0===t.index}function a(e){return k.test(e)}function i(e){var n,t,r,i,o=e.className+" ";if(o+=e.parentNode?e.parentNode.className:"",t=B.exec(o))return R(t[1])?t[1]:"no-highlight";for(o=o.split(/\s+/),n=0,r=o.length;r>n;n++)if(i=o[n],a(i)||R(i))return i}function o(e,n){var t,r={};for(t in e)r[t]=e[t];if(n)for(t in n)r[t]=n[t];return r}function u(e){var n=[];return function r(e,a){for(var i=e.firstChild;i;i=i.nextSibling)3===i.nodeType?a+=i.nodeValue.length:1===i.nodeType&&(n.push({event:"start",offset:a,node:i}),a=r(i,a),t(i).match(/br|hr|img|input/)||n.push({event:"stop",offset:a,node:i}));return a}(e,0),n}function c(e,r,a){function i(){return e.length&&r.length?e[0].offset!==r[0].offset?e[0].offset<r[0].offset?e:r:"start"===r[0].event?e:r:e.length?e:r}function o(e){function r(e){return" "+e.nodeName+'="'+n(e.value)+'"'}l+="<"+t(e)+w.map.call(e.attributes,r).join("")+">"}function u(e){l+="</"+t(e)+">"}function c(e){("start"===e.event?o:u)(e.node)}for(var s=0,l="",f=[];e.length||r.length;){var g=i();if(l+=n(a.substr(s,g[0].offset-s)),s=g[0].offset,g===e){f.reverse().forEach(u);do c(g.splice(0,1)[0]),g=i();while(g===e&&g.length&&g[0].offset===s);f.reverse().forEach(o)}else"start"===g[0].event?f.push(g[0].node):f.pop(),c(g.splice(0,1)[0])}return l+n(a.substr(s))}function s(e){function n(e){return e&&e.source||e}function t(t,r){return new RegExp(n(t),"m"+(e.cI?"i":"")+(r?"g":""))}function r(a,i){if(!a.compiled){if(a.compiled=!0,a.k=a.k||a.bK,a.k){var u={},c=function(n,t){e.cI&&(t=t.toLowerCase()),t.split(" ").forEach(function(e){var 
t=e.split("|");u[t[0]]=[n,t[1]?Number(t[1]):1]})};"string"==typeof a.k?c("keyword",a.k):E(a.k).forEach(function(e){c(e,a.k[e])}),a.k=u}a.lR=t(a.l||/\w+/,!0),i&&(a.bK&&(a.b="\\b("+a.bK.split(" ").join("|")+")\\b"),a.b||(a.b=/\B|\b/),a.bR=t(a.b),a.e||a.eW||(a.e=/\B|\b/),a.e&&(a.eR=t(a.e)),a.tE=n(a.e)||"",a.eW&&i.tE&&(a.tE+=(a.e?"|":"")+i.tE)),a.i&&(a.iR=t(a.i)),null==a.r&&(a.r=1),a.c||(a.c=[]);var s=[];a.c.forEach(function(e){e.v?e.v.forEach(function(n){s.push(o(e,n))}):s.push("self"===e?a:e)}),a.c=s,a.c.forEach(function(e){r(e,a)}),a.starts&&r(a.starts,i);var l=a.c.map(function(e){return e.bK?"\\.?("+e.b+")\\.?":e.b}).concat([a.tE,a.i]).map(n).filter(Boolean);a.t=l.length?t(l.join("|"),!0):{exec:function(){return null}}}}r(e)}function l(e,t,a,i){function o(e,n){var t,a;for(t=0,a=n.c.length;a>t;t++)if(r(n.c[t].bR,e))return n.c[t]}function u(e,n){if(r(e.eR,n)){for(;e.endsParent&&e.parent;)e=e.parent;return e}return e.eW?u(e.parent,n):void 0}function c(e,n){return!a&&r(n.iR,e)}function g(e,n){var t=N.cI?n[0].toLowerCase():n[0];return e.k.hasOwnProperty(t)&&e.k[t]}function h(e,n,t,r){var a=r?"":y.classPrefix,i='<span class="'+a,o=t?"":C;return i+=e+'">',i+n+o}function p(){var e,t,r,a;if(!E.k)return n(B);for(a="",t=0,E.lR.lastIndex=0,r=E.lR.exec(B);r;)a+=n(B.substr(t,r.index-t)),e=g(E,r),e?(M+=e[1],a+=h(e[0],n(r[0]))):a+=n(r[0]),t=E.lR.lastIndex,r=E.lR.exec(B);return a+n(B.substr(t))}function d(){var e="string"==typeof E.sL;if(e&&!x[E.sL])return n(B);var t=e?l(E.sL,B,!0,L[E.sL]):f(B,E.sL.length?E.sL:void 0);return E.r>0&&(M+=t.r),e&&(L[E.sL]=t.top),h(t.language,t.value,!1,!0)}function b(){k+=null!=E.sL?d():p(),B=""}function v(e){k+=e.cN?h(e.cN,"",!0):"",E=Object.create(e,{parent:{value:E}})}function m(e,n){if(B+=e,null==n)return b(),0;var t=o(n,E);if(t)return t.skip?B+=n:(t.eB&&(B+=n),b(),t.rB||t.eB||(B=n)),v(t,n),t.rB?0:n.length;var r=u(E,n);if(r){var a=E;a.skip?B+=n:(a.rE||a.eE||(B+=n),b(),a.eE&&(B=n));do 
E.cN&&(k+=C),E.skip||(M+=E.r),E=E.parent;while(E!==r.parent);return r.starts&&v(r.starts,""),a.rE?0:n.length}if(c(n,E))throw new Error('Illegal lexeme "'+n+'" for mode "'+(E.cN||"<unnamed>")+'"');return B+=n,n.length||1}var N=R(e);if(!N)throw new Error('Unknown language: "'+e+'"');s(N);var w,E=i||N,L={},k="";for(w=E;w!==N;w=w.parent)w.cN&&(k=h(w.cN,"",!0)+k);var B="",M=0;try{for(var I,j,O=0;;){if(E.t.lastIndex=O,I=E.t.exec(t),!I)break;j=m(t.substr(O,I.index-O),I[0]),O=I.index+j}for(m(t.substr(O)),w=E;w.parent;w=w.parent)w.cN&&(k+=C);return{r:M,value:k,language:e,top:E}}catch(T){if(T.message&&-1!==T.message.indexOf("Illegal"))return{r:0,value:n(t)};throw T}}function f(e,t){t=t||y.languages||E(x);var r={r:0,value:n(e)},a=r;return t.filter(R).forEach(function(n){var t=l(n,e,!1);t.language=n,t.r>a.r&&(a=t),t.r>r.r&&(a=r,r=t)}),a.language&&(r.second_best=a),r}function g(e){return y.tabReplace||y.useBR?e.replace(M,function(e,n){return y.useBR&&"\n"===e?"<br>":y.tabReplace?n.replace(/\t/g,y.tabReplace):void 0}):e}function h(e,n,t){var r=n?L[n]:t,a=[e.trim()];return e.match(/\bhljs\b/)||a.push("hljs"),-1===e.indexOf(r)&&a.push(r),a.join(" ").trim()}function p(e){var n,t,r,o,s,p=i(e);a(p)||(y.useBR?(n=document.createElementNS("http://www.w3.org/1999/xhtml","div"),n.innerHTML=e.innerHTML.replace(/\n/g,"").replace(/<br[ \/]*>/g,"\n")):n=e,s=n.textContent,r=p?l(p,s,!0):f(s),t=u(n),t.length&&(o=document.createElementNS("http://www.w3.org/1999/xhtml","div"),o.innerHTML=r.value,r.value=c(t,u(o),s)),r.value=g(r.value),e.innerHTML=r.value,e.className=h(e.className,p,r.language),e.result={language:r.language,re:r.r},r.second_best&&(e.second_best={language:r.second_best.language,re:r.second_best.r}))}function d(e){y=o(y,e)}function b(){if(!b.called){b.called=!0;var e=document.querySelectorAll("pre code");w.forEach.call(e,p)}}function v(){addEventListener("DOMContentLoaded",b,!1),addEventListener("load",b,!1)}function m(n,t){var 
r=x[n]=t(e);r.aliases&&r.aliases.forEach(function(e){L[e]=n})}function N(){return E(x)}function R(e){return e=(e||"").toLowerCase(),x[e]||x[L[e]]}var w=[],E=Object.keys,x={},L={},k=/^(no-?highlight|plain|text)$/i,B=/\blang(?:uage)?-([\w-]+)\b/i,M=/((^(<[^>]+>|\t|)+|(?:\n)))/gm,C="</span>",y={classPrefix:"hljs-",tabReplace:null,useBR:!1,languages:void 0},I={"&":"&amp;","<":"&lt;",">":"&gt;"};return e.highlight=l,e.highlightAuto=f,e.fixMarkup=g,e.highlightBlock=p,e.configure=d,e.initHighlighting=b,e.initHighlightingOnLoad=v,e.registerLanguage=m,e.listLanguages=N,e.getLanguage=R,e.inherit=o,e.IR="[a-zA-Z]\\w*",e.UIR="[a-zA-Z_]\\w*",e.NR="\\b\\d+(\\.\\d+)?",e.CNR="(-?)(\\b0[xX][a-fA-F0-9]+|(\\b\\d+(\\.\\d*)?|\\.\\d+)([eE][-+]?\\d+)?)",e.BNR="\\b(0b[01]+)",e.RSR="!|!=|!==|%|%=|&|&&|&=|\\*|\\*=|\\+|\\+=|,|-|-=|/=|/|:|;|<<|<<=|<=|<|===|==|=|>>>=|>>=|>=|>>>|>>|>|\\?|\\[|\\{|\\(|\\^|\\^=|\\||\\|=|\\|\\||~",e.BE={b:"\\\\[\\s\\S]",r:0},e.ASM={cN:"string",b:"'",e:"'",i:"\\n",c:[e.BE]},e.QSM={cN:"string",b:'"',e:'"',i:"\\n",c:[e.BE]},e.PWM={b:/\b(a|an|the|are|I'm|isn't|don't|doesn't|won't|but|just|should|pretty|simply|enough|gonna|going|wtf|so|such|will|you|your|like)\b/},e.C=function(n,t,r){var a=e.inherit({cN:"comment",b:n,e:t,c:[]},r||{});return a.c.push(e.PWM),a.c.push({cN:"doctag",b:"(?:TODO|FIXME|NOTE|BUG|XXX):",r:0}),a},e.CLCM=e.C("//","$"),e.CBCM=e.C("/\\*","\\*/"),e.HCM=e.C("#","$"),e.NM={cN:"number",b:e.NR,r:0},e.CNM={cN:"number",b:e.CNR,r:0},e.BNM={cN:"number",b:e.BNR,r:0},e.CSSNM={cN:"number",b:e.NR+"(%|em|ex|ch|rem|vw|vh|vmin|vmax|cm|mm|in|pt|pc|px|deg|grad|rad|turn|s|ms|Hz|kHz|dpi|dpcm|dppx)?",r:0},e.RM={cN:"regexp",b:/\//,e:/\/[gimuy]*/,i:/\n/,c:[e.BE,{b:/\[/,e:/\]/,r:0,c:[e.BE]}]},e.TM={cN:"title",b:e.IR,r:0},e.UTM={cN:"title",b:e.UIR,r:0},e.METHOD_GUARD={b:"\\.\\s*"+e.UIR,r:0},e});hljs.registerLanguage("coffeescript",function(e){var c={keyword:"in if for while finally new do return else break catch instanceof throw try this switch continue typeof delete 
debugger super then unless until loop of by when and or is isnt not",literal:"true false null undefined yes no on off",built_in:"npm require console print module global window document"},n="[A-Za-z$_][0-9A-Za-z$_]*",r={cN:"subst",b:/#\{/,e:/}/,k:c},s=[e.BNM,e.inherit(e.CNM,{starts:{e:"(\\s*/)?",r:0}}),{cN:"string",v:[{b:/'''/,e:/'''/,c:[e.BE]},{b:/'/,e:/'/,c:[e.BE]},{b:/"""/,e:/"""/,c:[e.BE,r]},{b:/"/,e:/"/,c:[e.BE,r]}]},{cN:"regexp",v:[{b:"///",e:"///",c:[r,e.HCM]},{b:"//[gim]*",r:0},{b:/\/(?![ *])(\\\/|.)*?\/[gim]*(?=\W|$)/}]},{b:"@"+n},{b:"`",e:"`",eB:!0,eE:!0,sL:"javascript"}];r.c=s;var i=e.inherit(e.TM,{b:n}),t="(\\(.*\\))?\\s*\\B[-=]>",o={cN:"params",b:"\\([^\\(]",rB:!0,c:[{b:/\(/,e:/\)/,k:c,c:["self"].concat(s)}]};return{aliases:["coffee","cson","iced"],k:c,i:/\/\*/,c:s.concat([e.C("###","###"),e.HCM,{cN:"function",b:"^\\s*"+n+"\\s*=\\s*"+t,e:"[-=]>",rB:!0,c:[i,o]},{b:/[:\(,=]\s*/,r:0,c:[{cN:"function",b:t,e:"[-=]>",rB:!0,c:[o]}]},{cN:"class",bK:"class",e:"$",i:/[:="\[\]]/,c:[{bK:"extends",eW:!0,i:/[:="\[\]]/,c:[i]},i]},{b:n+":",e:":",rB:!0,rE:!0,r:0}])}});hljs.registerLanguage("ini",function(e){var b={cN:"string",c:[e.BE],v:[{b:"'''",e:"'''",r:10},{b:'"""',e:'"""',r:10},{b:'"',e:'"'},{b:"'",e:"'"}]};return{aliases:["toml"],cI:!0,i:/\S/,c:[e.C(";","$"),e.HCM,{cN:"section",b:/^\s*\[+/,e:/\]+/},{b:/^[a-z0-9\[\]_-]+\s*=\s*/,e:"$",rB:!0,c:[{cN:"attr",b:/[a-z0-9\[\]_-]+/},{b:/=/,eW:!0,r:0,c:[{cN:"literal",b:/\bon|off|true|false|yes|no\b/},{cN:"variable",v:[{b:/\$[\w\d"][\w\d_]*/},{b:/\$\{(.*?)}/}]},b,{cN:"number",b:/([\+\-]+)?[\d]+_[\d_]+/},e.NM]}]}]}});hljs.registerLanguage("xml",function(s){var 
e="[A-Za-z0-9\\._:-]+",t={eW:!0,i:/</,r:0,c:[{cN:"attr",b:e,r:0},{b:/=\s*/,r:0,c:[{cN:"string",endsParent:!0,v:[{b:/"/,e:/"/},{b:/'/,e:/'/},{b:/[^\s"'=<>`]+/}]}]}]};return{aliases:["html","xhtml","rss","atom","xjb","xsd","xsl","plist"],cI:!0,c:[{cN:"meta",b:"<!DOCTYPE",e:">",r:10,c:[{b:"\\[",e:"\\]"}]},s.C("<!--","-->",{r:10}),{b:"<\\!\\[CDATA\\[",e:"\\]\\]>",r:10},{b:/<\?(php)?/,e:/\?>/,sL:"php",c:[{b:"/\\*",e:"\\*/",skip:!0}]},{cN:"tag",b:"<style(?=\\s|>|$)",e:">",k:{name:"style"},c:[t],starts:{e:"</style>",rE:!0,sL:["css","xml"]}},{cN:"tag",b:"<script(?=\\s|>|$)",e:">",k:{name:"script"},c:[t],starts:{e:"</script>",rE:!0,sL:["actionscript","javascript","handlebars","xml"]}},{cN:"meta",v:[{b:/<\?xml/,e:/\?>/,r:10},{b:/<\?\w+/,e:/\?>/}]},{cN:"tag",b:"</?",e:"/?>",c:[{cN:"name",b:/[^\/><\s]+/,r:0},t]}]}});hljs.registerLanguage("markdown",function(e){return{aliases:["md","mkdown","mkd"],c:[{cN:"section",v:[{b:"^#{1,6}",e:"$"},{b:"^.+?\\n[=-]{2,}$"}]},{b:"<",e:">",sL:"xml",r:0},{cN:"bullet",b:"^([*+-]|(\\d+\\.))\\s+"},{cN:"strong",b:"[*_]{2}.+?[*_]{2}"},{cN:"emphasis",v:[{b:"\\*.+?\\*"},{b:"_.+?_",r:0}]},{cN:"quote",b:"^>\\s+",e:"$"},{cN:"code",v:[{b:"^```w*s*$",e:"^```s*$"},{b:"`.+?`"},{b:"^( {4}| )",e:"$",r:0}]},{b:"^[-\\*]{3,}",e:"$"},{b:"\\[.+?\\][\\(\\[].*?[\\)\\]]",rB:!0,c:[{cN:"string",b:"\\[",e:"\\]",eB:!0,rE:!0,r:0},{cN:"link",b:"\\]\\(",e:"\\)",eB:!0,eE:!0},{cN:"symbol",b:"\\]\\[",e:"\\]",eB:!0,eE:!0}],r:10},{b:/^\[[^\n]+\]:/,rB:!0,c:[{cN:"symbol",b:/\[/,e:/\]/,eB:!0,eE:!0},{cN:"link",b:/:\s*/,e:/$/,eB:!0}]}]}});hljs.registerLanguage("cs",function(e){var i={keyword:"abstract as base bool break byte case catch char checked const continue decimal default delegate do double else enum event explicit extern finally fixed float for foreach goto if implicit in int interface internal is lock long object operator out override params private protected public readonly ref sbyte sealed short sizeof stackalloc static string struct switch this try typeof uint ulong 
unchecked unsafe ushort using virtual void volatile while nameof add alias ascending async await by descending dynamic equals from get global group into join let on orderby partial remove select set value var where yield",literal:"null false true"},r={cN:"string",b:'@"',e:'"',c:[{b:'""'}]},t=e.inherit(r,{i:/\n/}),a={cN:"subst",b:"{",e:"}",k:i},n=e.inherit(a,{i:/\n/}),c={cN:"string",b:/\$"/,e:'"',i:/\n/,c:[{b:"{{"},{b:"}}"},e.BE,n]},s={cN:"string",b:/\$@"/,e:'"',c:[{b:"{{"},{b:"}}"},{b:'""'},a]},o=e.inherit(s,{i:/\n/,c:[{b:"{{"},{b:"}}"},{b:'""'},n]});a.c=[s,c,r,e.ASM,e.QSM,e.CNM,e.CBCM],n.c=[o,c,t,e.ASM,e.QSM,e.CNM,e.inherit(e.CBCM,{i:/\n/})];var l={v:[s,c,r,e.ASM,e.QSM]},b=e.IR+"(<"+e.IR+"(\\s*,\\s*"+e.IR+")*>)?(\\[\\])?";return{aliases:["csharp"],k:i,i:/::/,c:[e.C("///","$",{rB:!0,c:[{cN:"doctag",v:[{b:"///",r:0},{b:"<!--|-->"},{b:"</?",e:">"}]}]}),e.CLCM,e.CBCM,{cN:"meta",b:"#",e:"$",k:{"meta-keyword":"if else elif endif define undef warning error line region endregion pragma checksum"}},l,e.CNM,{bK:"class interface",e:/[{;=]/,i:/[^\s:]/,c:[e.TM,e.CLCM,e.CBCM]},{bK:"namespace",e:/[{;=]/,i:/[^\s:]/,c:[e.inherit(e.TM,{b:"[a-zA-Z](\\.?\\w)*"}),e.CLCM,e.CBCM]},{bK:"new return throw await",r:0},{cN:"function",b:"("+b+"\\s+)+"+e.IR+"\\s*\\(",rB:!0,e:/[{;=]/,eE:!0,k:i,c:[{b:e.IR+"\\s*\\(",rB:!0,c:[e.TM],r:0},{cN:"params",b:/\(/,e:/\)/,eB:!0,eE:!0,k:i,r:0,c:[l,e.CNM,e.CBCM]},e.CLCM,e.CBCM]}]}});hljs.registerLanguage("ruby",function(e){var b="[a-zA-Z_]\\w*[!?=]?|[-+~]\\@|<<|>>|=~|===?|<=>|[<>]=?|\\*\\*|[-/+%^&*~`|]|\\[\\]=?",r={keyword:"and then defined module in return redo if BEGIN retry end for self when next until do begin unless END rescue else break undef not super class case require yield alias while ensure elsif or include attr_reader attr_writer attr_accessor",literal:"true false 
nil"},c={cN:"doctag",b:"@[A-Za-z]+"},a={b:"#<",e:">"},s=[e.C("#","$",{c:[c]}),e.C("^\\=begin","^\\=end",{c:[c],r:10}),e.C("^__END__","\\n$")],n={cN:"subst",b:"#\\{",e:"}",k:r},t={cN:"string",c:[e.BE,n],v:[{b:/'/,e:/'/},{b:/"/,e:/"/},{b:/`/,e:/`/},{b:"%[qQwWx]?\\(",e:"\\)"},{b:"%[qQwWx]?\\[",e:"\\]"},{b:"%[qQwWx]?{",e:"}"},{b:"%[qQwWx]?<",e:">"},{b:"%[qQwWx]?/",e:"/"},{b:"%[qQwWx]?%",e:"%"},{b:"%[qQwWx]?-",e:"-"},{b:"%[qQwWx]?\\|",e:"\\|"},{b:/\B\?(\\\d{1,3}|\\x[A-Fa-f0-9]{1,2}|\\u[A-Fa-f0-9]{4}|\\?\S)\b/},{b:/<<(-?)\w+$/,e:/^\s*\w+$/}]},i={cN:"params",b:"\\(",e:"\\)",endsParent:!0,k:r},d=[t,a,{cN:"class",bK:"class module",e:"$|;",i:/=/,c:[e.inherit(e.TM,{b:"[A-Za-z_]\\w*(::\\w+)*(\\?|\\!)?"}),{b:"<\\s*",c:[{b:"("+e.IR+"::)?"+e.IR}]}].concat(s)},{cN:"function",bK:"def",e:"$|;",c:[e.inherit(e.TM,{b:b}),i].concat(s)},{b:e.IR+"::"},{cN:"symbol",b:e.UIR+"(\\!|\\?)?:",r:0},{cN:"symbol",b:":(?!\\s)",c:[t,{b:b}],r:0},{cN:"number",b:"(\\b0[0-7_]+)|(\\b0x[0-9a-fA-F_]+)|(\\b[1-9][0-9_]*(\\.[0-9_]+)?)|[0_]\\b",r:0},{b:"(\\$\\W)|((\\$|\\@\\@?)(\\w+))"},{cN:"params",b:/\|/,e:/\|/,k:r},{b:"("+e.RSR+")\\s*",c:[a,{cN:"regexp",c:[e.BE,n],i:/\n/,v:[{b:"/",e:"/[a-z]*"},{b:"%r{",e:"}[a-z]*"},{b:"%r\\(",e:"\\)[a-z]*"},{b:"%r!",e:"![a-z]*"},{b:"%r\\[",e:"\\][a-z]*"}]}].concat(s),r:0}].concat(s);n.c=d,i.c=d;var l="[>?]>",o="[\\w#]+\\(\\w+\\):\\d+:\\d+>",w="(\\w+-)?\\d+\\.\\d+\\.\\d(p\\d+)?[^>]+>",u=[{b:/^\s*=>/,starts:{e:"$",c:d}},{cN:"meta",b:"^("+l+"|"+o+"|"+w+")",starts:{e:"$",c:d}}];return{aliases:["rb","gemspec","podspec","thor","irb"],k:r,i:/\/\*/,c:s.concat(u).concat(d)}});hljs.registerLanguage("apache",function(e){var r={cN:"number",b:"[\\$%]\\d+"};return{aliases:["apacheconf"],cI:!0,c:[e.HCM,{cN:"section",b:"</?",e:">"},{cN:"attribute",b:/\w+/,r:0,k:{nomarkup:"order deny allow setenv rewriterule rewriteengine rewritecond documentroot sethandler errordocument loadmodule options header listen serverroot servername"},starts:{e:/$/,r:0,k:{literal:"on off 
all"},c:[{cN:"meta",b:"\\s\\[",e:"\\]$"},{cN:"variable",b:"[\\$%]\\{",e:"\\}",c:["self",r]},r,e.QSM]}}],i:/\S/}});hljs.registerLanguage("http",function(e){var t="HTTP/[0-9\\.]+";return{aliases:["https"],i:"\\S",c:[{b:"^"+t,e:"$",c:[{cN:"number",b:"\\b\\d{3}\\b"}]},{b:"^[A-Z]+ (.*?) "+t+"$",rB:!0,e:"$",c:[{cN:"string",b:" ",e:" ",eB:!0,eE:!0},{b:t},{cN:"keyword",b:"[A-Z]+"}]},{cN:"attribute",b:"^\\w",e:": ",eE:!0,i:"\\n|\\s|=",starts:{e:"$",r:0}},{b:"\\n\\n",starts:{sL:[],eW:!0}}]}});hljs.registerLanguage("sql",function(e){var t=e.C("--","$");return{cI:!0,i:/[<>{}*#]/,c:[{bK:"begin end start commit rollback savepoint lock alter create drop rename call delete do handler insert load replace select truncate update set show pragma grant merge describe use explain help declare prepare execute deallocate release unlock purge reset change stop analyze cache flush optimize repair kill install uninstall checksum restore check backup revoke comment",e:/;/,eW:!0,l:/[\w\.]+/,k:{keyword:"abort abs absolute acc acce accep accept access accessed accessible account acos action activate add addtime admin administer advanced advise aes_decrypt aes_encrypt after agent aggregate ali alia alias allocate allow alter always analyze ancillary and any anydata anydataset anyschema anytype apply archive archived archivelog are as asc ascii asin assembly assertion associate asynchronous at atan atn2 attr attri attrib attribu attribut attribute attributes audit authenticated authentication authid authors auto autoallocate autodblink autoextend automatic availability avg backup badfile basicfile before begin beginning benchmark between bfile bfile_base big bigfile bin binary_double binary_float binlog bit_and bit_count bit_length bit_or bit_xor bitmap blob_base block blocksize body both bound buffer_cache buffer_pool build bulk by byte byteordermark bytes cache caching call calling cancel capacity cascade cascaded case cast catalog category ceil ceiling chain change changed char_base char_length 
character_length characters characterset charindex charset charsetform charsetid check checksum checksum_agg child choose chr chunk class cleanup clear client clob clob_base clone close cluster_id cluster_probability cluster_set clustering coalesce coercibility col collate collation collect colu colum column column_value columns columns_updated comment commit compact compatibility compiled complete composite_limit compound compress compute concat concat_ws concurrent confirm conn connec connect connect_by_iscycle connect_by_isleaf connect_by_root connect_time connection consider consistent constant constraint constraints constructor container content contents context contributors controlfile conv convert convert_tz corr corr_k corr_s corresponding corruption cos cost count count_big counted covar_pop covar_samp cpu_per_call cpu_per_session crc32 create creation critical cross cube cume_dist curdate current current_date current_time current_timestamp current_user cursor curtime customdatum cycle data database databases datafile datafiles datalength date_add date_cache date_format date_sub dateadd datediff datefromparts datename datepart datetime2fromparts day day_to_second dayname dayofmonth dayofweek dayofyear days db_role_change dbtimezone ddl deallocate declare decode decompose decrement decrypt deduplicate def defa defau defaul default defaults deferred defi defin define degrees delayed delegate delete delete_all delimited demand dense_rank depth dequeue des_decrypt des_encrypt des_key_file desc descr descri describ describe descriptor deterministic diagnostics difference dimension direct_load directory disable disable_all disallow disassociate discardfile disconnect diskgroup distinct distinctrow distribute distributed div do document domain dotnet double downgrade drop dumpfile duplicate duration each edition editionable editions element ellipsis else elsif elt empty enable enable_all enclosed encode encoding encrypt end end-exec endian enforced engine engines 
enqueue enterprise entityescaping eomonth error errors escaped evalname evaluate event eventdata events except exception exceptions exchange exclude excluding execu execut execute exempt exists exit exp expire explain export export_set extended extent external external_1 external_2 externally extract failed failed_login_attempts failover failure far fast feature_set feature_value fetch field fields file file_name_convert filesystem_like_logging final finish first first_value fixed flash_cache flashback floor flush following follows for forall force form forma format found found_rows freelist freelists freepools fresh from from_base64 from_days ftp full function general generated get get_format get_lock getdate getutcdate global global_name globally go goto grant grants greatest group group_concat group_id grouping grouping_id groups gtid_subtract guarantee guard handler hash hashkeys having hea head headi headin heading heap help hex hierarchy high high_priority hosts hour http id ident_current ident_incr ident_seed identified identity idle_time if ifnull ignore iif ilike ilm immediate import in include including increment index indexes indexing indextype indicator indices inet6_aton inet6_ntoa inet_aton inet_ntoa infile initial initialized initially initrans inmemory inner innodb input insert install instance instantiable instr interface interleaved intersect into invalidate invisible is is_free_lock is_ipv4 is_ipv4_compat is_not is_not_null is_used_lock isdate isnull isolation iterate java join json json_exists keep keep_duplicates key keys kill language large last last_day last_insert_id last_value lax lcase lead leading least leaves left len lenght length less level levels library like like2 like4 likec limit lines link list listagg little ln load load_file lob lobs local localtime localtimestamp locate locator lock locked log log10 log2 logfile logfiles logging logical logical_reads_per_call logoff logon logs long loop low low_priority lower lpad lrtrim ltrim 
main make_set makedate maketime managed management manual map mapping mask master master_pos_wait match matched materialized max maxextents maximize maxinstances maxlen maxlogfiles maxloghistory maxlogmembers maxsize maxtrans md5 measures median medium member memcompress memory merge microsecond mid migration min minextents minimum mining minus minute minvalue missing mod mode model modification modify module monitoring month months mount move movement multiset mutex name name_const names nan national native natural nav nchar nclob nested never new newline next nextval no no_write_to_binlog noarchivelog noaudit nobadfile nocheck nocompress nocopy nocycle nodelay nodiscardfile noentityescaping noguarantee nokeep nologfile nomapping nomaxvalue nominimize nominvalue nomonitoring none noneditionable nonschema noorder nopr nopro noprom nopromp noprompt norely noresetlogs noreverse normal norowdependencies noschemacheck noswitch not nothing notice notrim novalidate now nowait nth_value nullif nulls num numb numbe nvarchar nvarchar2 object ocicoll ocidate ocidatetime ociduration ociinterval ociloblocator ocinumber ociref ocirefcursor ocirowid ocistring ocitype oct octet_length of off offline offset oid oidindex old on online only opaque open operations operator optimal optimize option optionally or oracle oracle_date oradata ord ordaudio orddicom orddoc order ordimage ordinality ordvideo organization orlany orlvary out outer outfile outline output over overflow overriding package pad parallel parallel_enable parameters parent parse partial partition partitions pascal passing password password_grace_time password_lock_time password_reuse_max password_reuse_time password_verify_function patch path patindex pctincrease pctthreshold pctused pctversion percent percent_rank percentile_cont percentile_disc performance period period_add period_diff permanent physical pi pipe pipelined pivot pluggable plugin policy position post_transaction pow power pragma prebuilt precedes 
preceding precision prediction prediction_cost prediction_details prediction_probability prediction_set prepare present preserve prior priority private private_sga privileges procedural procedure procedure_analyze processlist profiles project prompt protection public publishingservername purge quarter query quick quiesce quota quotename radians raise rand range rank raw read reads readsize rebuild record records recover recovery recursive recycle redo reduced ref reference referenced references referencing refresh regexp_like register regr_avgx regr_avgy regr_count regr_intercept regr_r2 regr_slope regr_sxx regr_sxy reject rekey relational relative relaylog release release_lock relies_on relocate rely rem remainder rename repair repeat replace replicate replication required reset resetlogs resize resource respect restore restricted result result_cache resumable resume retention return returning returns reuse reverse revoke right rlike role roles rollback rolling rollup round row row_count rowdependencies rowid rownum rows rtrim rules safe salt sample save savepoint sb1 sb2 sb4 scan schema schemacheck scn scope scroll sdo_georaster sdo_topo_geometry search sec_to_time second section securefile security seed segment select self sequence sequential serializable server servererror session session_user sessions_per_user set sets settings sha sha1 sha2 share shared shared_pool short show shrink shutdown si_averagecolor si_colorhistogram si_featurelist si_positionalcolor si_stillimage si_texture siblings sid sign sin size size_t sizes skip slave sleep smalldatetimefromparts smallfile snapshot some soname sort soundex source space sparse spfile split sql sql_big_result sql_buffer_result sql_cache sql_calc_found_rows sql_small_result sql_variant_property sqlcode sqldata sqlerror sqlname sqlstate sqrt square standalone standby start starting startup statement static statistics stats_binomial_test stats_crosstab stats_ks_test stats_mode stats_mw_test stats_one_way_anova 
stats_t_test_ stats_t_test_indep stats_t_test_one stats_t_test_paired stats_wsr_test status std stddev stddev_pop stddev_samp stdev stop storage store stored str str_to_date straight_join strcmp strict string struct stuff style subdate subpartition subpartitions substitutable substr substring subtime subtring_index subtype success sum suspend switch switchoffset switchover sync synchronous synonym sys sys_xmlagg sysasm sysaux sysdate sysdatetimeoffset sysdba sysoper system system_user sysutcdatetime table tables tablespace tan tdo template temporary terminated tertiary_weights test than then thread through tier ties time time_format time_zone timediff timefromparts timeout timestamp timestampadd timestampdiff timezone_abbr timezone_minute timezone_region to to_base64 to_date to_days to_seconds todatetimeoffset trace tracking transaction transactional translate translation treat trigger trigger_nestlevel triggers trim truncate try_cast try_convert try_parse type ub1 ub2 ub4 ucase unarchived unbounded uncompress under undo unhex unicode uniform uninstall union unique unix_timestamp unknown unlimited unlock unpivot unrecoverable unsafe unsigned until untrusted unusable unused update updated upgrade upped upper upsert url urowid usable usage use use_stored_outlines user user_data user_resources users using utc_date utc_timestamp uuid uuid_short validate validate_password_strength validation valist value values var var_samp varcharc vari varia variab variabl variable variables variance varp varraw varrawc varray verify version versions view virtual visible void wait wallet warning warnings week weekday weekofyear wellformed when whene whenev wheneve whenever where while whitespace with within without work wrapped xdb xml xmlagg xmlattributes xmlcast xmlcolattval xmlelement xmlexists xmlforest xmlindex xmlnamespaces xmlpi xmlquery xmlroot xmlschema xmlserialize xmltable xmltype xor year year_to_month years yearweek",literal:"true false null",built_in:"array bigint binary 
bit blob boolean char character date dec decimal float int int8 integer interval number numeric real record serial serial8 smallint text varchar varying void"},c:[{cN:"string",b:"'",e:"'",c:[e.BE,{b:"''"}]},{cN:"string",b:'"',e:'"',c:[e.BE,{b:'""'}]},{cN:"string",b:"`",e:"`",c:[e.BE]},e.CNM,e.CBCM,t]},e.CBCM,t]}});hljs.registerLanguage("perl",function(e){var t="getpwent getservent quotemeta msgrcv scalar kill dbmclose undef lc ma syswrite tr send umask sysopen shmwrite vec qx utime local oct semctl localtime readpipe do return format read sprintf dbmopen pop getpgrp not getpwnam rewinddir qqfileno qw endprotoent wait sethostent bless s|0 opendir continue each sleep endgrent shutdown dump chomp connect getsockname die socketpair close flock exists index shmgetsub for endpwent redo lstat msgctl setpgrp abs exit select print ref gethostbyaddr unshift fcntl syscall goto getnetbyaddr join gmtime symlink semget splice x|0 getpeername recv log setsockopt cos last reverse gethostbyname getgrnam study formline endhostent times chop length gethostent getnetent pack getprotoent getservbyname rand mkdir pos chmod y|0 substr endnetent printf next open msgsnd readdir use unlink getsockopt getpriority rindex wantarray hex system getservbyport endservent int chr untie rmdir prototype tell listen fork shmread ucfirst setprotoent else sysseek link getgrgid shmctl waitpid unpack getnetbyname reset chdir grep split require caller lcfirst until warn while values shift telldir getpwuid my getprotobynumber delete and sort uc defined srand accept package seekdir getprotobyname semop our rename seek if q|0 chroot sysread setpwent no crypt getc chown sqrt write setnetent setpriority foreach tie sin msgget map stat getlogin unless elsif truncate exec keys glob tied closedirioctl socket readlink eval xor readline binmode setservent eof ord bind alarm pipe atan2 getgrent exp time push setgrent gt lt or ne m|0 break given say state 
when",r={cN:"subst",b:"[$@]\\{",e:"\\}",k:t},s={b:"->{",e:"}"},n={v:[{b:/\$\d/},{b:/[\$%@](\^\w\b|#\w+(::\w+)*|{\w+}|\w+(::\w*)*)/},{b:/[\$%@][^\s\w{]/,r:0}]},i=[e.BE,r,n],o=[n,e.HCM,e.C("^\\=\\w","\\=cut",{eW:!0}),s,{cN:"string",c:i,v:[{b:"q[qwxr]?\\s*\\(",e:"\\)",r:5},{b:"q[qwxr]?\\s*\\[",e:"\\]",r:5},{b:"q[qwxr]?\\s*\\{",e:"\\}",r:5},{b:"q[qwxr]?\\s*\\|",e:"\\|",r:5},{b:"q[qwxr]?\\s*\\<",e:"\\>",r:5},{b:"qw\\s+q",e:"q",r:5},{b:"'",e:"'",c:[e.BE]},{b:'"',e:'"'},{b:"`",e:"`",c:[e.BE]},{b:"{\\w+}",c:[],r:0},{b:"-?\\w+\\s*\\=\\>",c:[],r:0}]},{cN:"number",b:"(\\b0[0-7_]+)|(\\b0x[0-9a-fA-F_]+)|(\\b[1-9][0-9_]*(\\.[0-9_]+)?)|[0_]\\b",r:0},{b:"(\\/\\/|"+e.RSR+"|\\b(split|return|print|reverse|grep)\\b)\\s*",k:"split return print reverse grep",r:0,c:[e.HCM,{cN:"regexp",b:"(s|tr|y)/(\\\\.|[^/])*/(\\\\.|[^/])*/[a-z]*",r:10},{cN:"regexp",b:"(m|qr)?/",e:"/[a-z]*",c:[e.BE],r:0}]},{cN:"function",bK:"sub",e:"(\\s*\\(.*?\\))?[;{]",eE:!0,r:5,c:[e.TM]},{b:"-\\w\\b",r:0},{b:"^__DATA__$",e:"^__END__$",sL:"mojolicious",c:[{b:"^@@.*",e:"$",cN:"comment"}]}];return r.c=o,s.c=o,{aliases:["pl","pm"],l:/[\w\.]+/,k:t,c:o}});hljs.registerLanguage("php",function(e){var c={b:"\\$+[a-zA-Z_-ÿ][a-zA-Z0-9_-ÿ]*"},i={cN:"meta",b:/<\?(php)?|\?>/},t={cN:"string",c:[e.BE,i],v:[{b:'b"',e:'"'},{b:"b'",e:"'"},e.inherit(e.ASM,{i:null}),e.inherit(e.QSM,{i:null})]},a={v:[e.BNM,e.CNM]};return{aliases:["php3","php4","php5","php6"],cI:!0,k:"and include_once list abstract global private echo interface as static endswitch array null if endwhile or const for endforeach self var while isset public protected exit foreach throw elseif include __FILE__ empty require_once do xor return parent clone use __CLASS__ __LINE__ else break print eval new catch __METHOD__ case exception default die require __FUNCTION__ enddeclare final try switch continue endfor endif declare unset true false trait goto instanceof insteadof __DIR__ __NAMESPACE__ yield 
finally",c:[e.HCM,e.C("//","$",{c:[i]}),e.C("/\\*","\\*/",{c:[{cN:"doctag",b:"@[A-Za-z]+"}]}),e.C("__halt_compiler.+?;",!1,{eW:!0,k:"__halt_compiler",l:e.UIR}),{cN:"string",b:/<<<['"]?\w+['"]?$/,e:/^\w+;?$/,c:[e.BE,{cN:"subst",v:[{b:/\$\w+/},{b:/\{\$/,e:/\}/}]}]},i,{cN:"keyword",b:/\$this\b/},c,{b:/(::|->)+[a-zA-Z_\x7f-\xff][a-zA-Z0-9_\x7f-\xff]*/},{cN:"function",bK:"function",e:/[;{]/,eE:!0,i:"\\$|\\[|%",c:[e.UTM,{cN:"params",b:"\\(",e:"\\)",c:["self",c,e.CBCM,t,a]}]},{cN:"class",bK:"class interface",e:"{",eE:!0,i:/[:\(\$"]/,c:[{bK:"extends implements"},e.UTM]},{bK:"namespace",e:";",i:/[\.']/,c:[e.UTM]},{bK:"use",e:";",c:[e.UTM]},{b:"=>"},t,a]}});hljs.registerLanguage("json",function(e){var i={literal:"true false null"},n=[e.QSM,e.CNM],r={e:",",eW:!0,eE:!0,c:n,k:i},t={b:"{",e:"}",c:[{cN:"attr",b:/"/,e:/"/,c:[e.BE],i:"\\n"},e.inherit(r,{b:/:/})],i:"\\S"},c={b:"\\[",e:"\\]",c:[e.inherit(r)],i:"\\S"};return n.splice(n.length,0,t,c),{c:n,k:i,i:"\\S"}});hljs.registerLanguage("cpp",function(t){var e={cN:"keyword",b:"\\b[a-z\\d_]*_t\\b"},r={cN:"string",v:[{b:'(u8?|U)?L?"',e:'"',i:"\\n",c:[t.BE]},{b:'(u8?|U)?R"',e:'"',c:[t.BE]},{b:"'\\\\?.",e:"'",i:"."}]},s={cN:"number",v:[{b:"\\b(0b[01']+)"},{b:"\\b([\\d']+(\\.[\\d']*)?|\\.[\\d']+)(u|U|l|L|ul|UL|f|F|b|B)"},{b:"(-?)(\\b0[xX][a-fA-F0-9']+|(\\b[\\d']+(\\.[\\d']*)?|\\.[\\d']+)([eE][-+]?[\\d']+)?)"}],r:0},i={cN:"meta",b:/#\s*[a-z]+\b/,e:/$/,k:{"meta-keyword":"if else elif endif define undef warning error line pragma ifdef ifndef include"},c:[{b:/\\\n/,r:0},t.inherit(r,{cN:"meta-string"}),{cN:"meta-string",b:"<",e:">",i:"\\n"},t.CLCM,t.CBCM]},a=t.IR+"\\s*\\(",c={keyword:"int float while private char catch import module export virtual operator sizeof dynamic_cast|10 typedef const_cast|10 const struct for static_cast|10 union namespace unsigned long volatile static protected bool template mutable if public friend do goto auto void enum else break extern using class asm case typeid short reinterpret_cast|10 default double 
register explicit signed typename try this switch continue inline delete alignof constexpr decltype noexcept static_assert thread_local restrict _Bool complex _Complex _Imaginary atomic_bool atomic_char atomic_schar atomic_uchar atomic_short atomic_ushort atomic_int atomic_uint atomic_long atomic_ulong atomic_llong atomic_ullong new throw return",built_in:"std string cin cout cerr clog stdin stdout stderr stringstream istringstream ostringstream auto_ptr deque list queue stack vector map set bitset multiset multimap unordered_set unordered_map unordered_multiset unordered_multimap array shared_ptr abort abs acos asin atan2 atan calloc ceil cosh cos exit exp fabs floor fmod fprintf fputs free frexp fscanf isalnum isalpha iscntrl isdigit isgraph islower isprint ispunct isspace isupper isxdigit tolower toupper labs ldexp log10 log malloc realloc memchr memcmp memcpy memset modf pow printf putchar puts scanf sinh sin snprintf sprintf sqrt sscanf strcat strchr strcmp strcpy strcspn strlen strncat strncmp strncpy strpbrk strrchr strspn strstr tanh tan vfprintf vprintf vsprintf endl initializer_list unique_ptr",literal:"true false nullptr NULL"},n=[e,t.CLCM,t.CBCM,s,r];return{aliases:["c","cc","h","c++","h++","hpp"],k:c,i:"</",c:n.concat([i,{b:"\\b(deque|list|queue|stack|vector|map|set|bitset|multiset|multimap|unordered_map|unordered_set|unordered_multiset|unordered_multimap|array)\\s*<",e:">",k:c,c:["self",e]},{b:t.IR+"::",k:c},{v:[{b:/=/,e:/;/},{b:/\(/,e:/\)/},{bK:"new throw return else",e:/;/}],k:c,c:n.concat([{b:/\(/,e:/\)/,k:c,c:n.concat(["self"]),r:0}]),r:0},{cN:"function",b:"("+t.IR+"[\\*&\\s]+)+"+a,rB:!0,e:/[{;=]/,eE:!0,k:c,i:/[^\w\s\*&]/,c:[{b:a,rB:!0,c:[t.TM],r:0},{cN:"params",b:/\(/,e:/\)/,k:c,r:0,c:[t.CLCM,t.CBCM,r,s,e]},t.CLCM,t.CBCM,i]}]),exports:{preprocessor:i,strings:r,k:c}}});hljs.registerLanguage("css",function(e){var 
c="[a-zA-Z-][a-zA-Z0-9_-]*",t={b:/[A-Z\_\.\-]+\s*:/,rB:!0,e:";",eW:!0,c:[{cN:"attribute",b:/\S/,e:":",eE:!0,starts:{eW:!0,eE:!0,c:[{b:/[\w-]+\(/,rB:!0,c:[{cN:"built_in",b:/[\w-]+/},{b:/\(/,e:/\)/,c:[e.ASM,e.QSM]}]},e.CSSNM,e.QSM,e.ASM,e.CBCM,{cN:"number",b:"#[0-9A-Fa-f]+"},{cN:"meta",b:"!important"}]}}]};return{cI:!0,i:/[=\/|'\$]/,c:[e.CBCM,{cN:"selector-id",b:/#[A-Za-z0-9_-]+/},{cN:"selector-class",b:/\.[A-Za-z0-9_-]+/},{cN:"selector-attr",b:/\[/,e:/\]/,i:"$"},{cN:"selector-pseudo",b:/:(:)?[a-zA-Z0-9\_\-\+\(\)"'.]+/},{b:"@(font-face|page)",l:"[a-z-]+",k:"font-face page"},{b:"@",e:"[{;]",i:/:/,c:[{cN:"keyword",b:/\w+/},{b:/\s/,eW:!0,eE:!0,r:0,c:[e.ASM,e.QSM,e.CSSNM]}]},{cN:"selector-tag",b:c,r:0},{b:"{",e:"}",i:/\S/,c:[e.CBCM,t]}]}});hljs.registerLanguage("makefile",function(e){var a={cN:"variable",b:/\$\(/,e:/\)/,c:[e.BE]};return{aliases:["mk","mak"],c:[e.HCM,{b:/^\w+\s*\W*=/,rB:!0,r:0,starts:{e:/\s*\W*=/,eE:!0,starts:{e:/$/,r:0,c:[a]}}},{cN:"section",b:/^[\w]+:\s*$/},{cN:"meta",b:/^\.PHONY:/,e:/$/,k:{"meta-keyword":".PHONY"},l:/[\.\w]+/},{b:/^\t+/,e:/$/,r:0,c:[e.QSM,a]}]}});hljs.registerLanguage("objectivec",function(e){var t={cN:"built_in",b:"\\b(AV|CA|CF|CG|CI|CL|CM|CN|CT|MK|MP|MTK|MTL|NS|SCN|SK|UI|WK|XC)\\w+"},_={keyword:"int float while char export sizeof typedef const struct for union unsigned long volatile static bool mutable if do return goto void enum else break extern asm case short default double register explicit signed typename this switch continue wchar_t inline readonly assign readwrite self @synchronized id typeof nonatomic super unichar IBOutlet IBAction strong weak copy in out inout bycopy byref oneway __strong __weak __block __autoreleasing @private @protected @public @try @property @end @throw @catch @finally @autoreleasepool @synthesize @dynamic @selector @optional @required @encode @package @import @defs @compatibility_alias __bridge __bridge_transfer __bridge_retained __bridge_retain __covariant __contravariant __kindof _Nonnull _Nullable 
_Null_unspecified __FUNCTION__ __PRETTY_FUNCTION__ __attribute__ getter setter retain unsafe_unretained nonnull nullable null_unspecified null_resettable class instancetype NS_DESIGNATED_INITIALIZER NS_UNAVAILABLE NS_REQUIRES_SUPER NS_RETURNS_INNER_POINTER NS_INLINE NS_AVAILABLE NS_DEPRECATED NS_ENUM NS_OPTIONS NS_SWIFT_UNAVAILABLE NS_ASSUME_NONNULL_BEGIN NS_ASSUME_NONNULL_END NS_REFINED_FOR_SWIFT NS_SWIFT_NAME NS_SWIFT_NOTHROW NS_DURING NS_HANDLER NS_ENDHANDLER NS_VALUERETURN NS_VOIDRETURN",literal:"false true FALSE TRUE nil YES NO NULL",built_in:"BOOL dispatch_once_t dispatch_queue_t dispatch_sync dispatch_async dispatch_once"},i=/[a-zA-Z@][a-zA-Z0-9_]*/,n="@interface @class @protocol @implementation";return{aliases:["mm","objc","obj-c"],k:_,l:i,i:"</",c:[t,e.CLCM,e.CBCM,e.CNM,e.QSM,{cN:"string",v:[{b:'@"',e:'"',i:"\\n",c:[e.BE]},{b:"'",e:"[^\\\\]'",i:"[^\\\\][^']"}]},{cN:"meta",b:"#",e:"$",c:[{cN:"meta-string",v:[{b:'"',e:'"'},{b:"<",e:">"}]}]},{cN:"class",b:"("+n.split(" ").join("|")+")\\b",e:"({|$)",eE:!0,k:n,l:i,c:[e.UTM]},{b:"\\."+e.UIR,r:0}]}});hljs.registerLanguage("nginx",function(e){var r={cN:"variable",v:[{b:/\$\d+/},{b:/\$\{/,e:/}/},{b:"[\\$\\@]"+e.UIR}]},b={eW:!0,l:"[a-z/_]+",k:{literal:"on off yes no true false none blocked debug info notice warn error crit select break last permanent redirect kqueue rtsig epoll poll /dev/poll"},r:0,i:"=>",c:[e.HCM,{cN:"string",c:[e.BE,r],v:[{b:/"/,e:/"/},{b:/'/,e:/'/}]},{b:"([a-z]+):/",e:"\\s",eW:!0,eE:!0,c:[r]},{cN:"regexp",c:[e.BE,r],v:[{b:"\\s\\^",e:"\\s|{|;",rE:!0},{b:"~\\*?\\s+",e:"\\s|{|;",rE:!0},{b:"\\*(\\.[a-z\\-]+)+"},{b:"([a-z\\-]+\\.)+\\*"}]},{cN:"number",b:"\\b\\d{1,3}\\.\\d{1,3}\\.\\d{1,3}\\.\\d{1,3}(:\\d{1,5})?\\b"},{cN:"number",b:"\\b\\d+[kKmMgGdshdwy]*\\b",r:0},r]};return{aliases:["nginxconf"],c:[e.HCM,{b:e.UIR+"\\s+{",rB:!0,e:"{",c:[{cN:"section",b:e.UIR}],r:0},{b:e.UIR+"\\s",e:";|{",rB:!0,c:[{cN:"attribute",b:e.UIR,starts:b}],r:0}],i:"[^\\s\\}]"}});hljs.registerLanguage("python",function(e){var 
r={cN:"meta",b:/^(>>>|\.\.\.) /},b={cN:"string",c:[e.BE],v:[{b:/(u|b)?r?'''/,e:/'''/,c:[r],r:10},{b:/(u|b)?r?"""/,e:/"""/,c:[r],r:10},{b:/(u|r|ur)'/,e:/'/,r:10},{b:/(u|r|ur)"/,e:/"/,r:10},{b:/(b|br)'/,e:/'/},{b:/(b|br)"/,e:/"/},e.ASM,e.QSM]},a={cN:"number",r:0,v:[{b:e.BNR+"[lLjJ]?"},{b:"\\b(0o[0-7]+)[lLjJ]?"},{b:e.CNR+"[lLjJ]?"}]},l={cN:"params",b:/\(/,e:/\)/,c:["self",r,a,b]};return{aliases:["py","gyp"],k:{keyword:"and elif is global as in if from raise for except finally print import pass return exec else break not with class assert yield try while continue del or def lambda async await nonlocal|10 None True False",built_in:"Ellipsis NotImplemented"},i:/(<\/|->|\?)/,c:[r,a,b,e.HCM,{v:[{cN:"function",bK:"def",r:10},{cN:"class",bK:"class"}],e:/:/,i:/[${=;\n,]/,c:[e.UTM,l,{b:/->/,eW:!0,k:"None"}]},{cN:"meta",b:/^[\t ]*@/,e:/$/},{b:/\b(print|exec)\(/}]}});hljs.registerLanguage("diff",function(e){return{aliases:["patch"],c:[{cN:"meta",r:10,v:[{b:/^@@ +\-\d+,\d+ +\+\d+,\d+ +@@$/},{b:/^\*\*\* +\d+,\d+ +\*\*\*\*$/},{b:/^\-\-\- +\d+,\d+ +\-\-\-\-$/}]},{cN:"comment",v:[{b:/Index: /,e:/$/},{b:/={3,}/,e:/$/},{b:/^\-{3}/,e:/$/},{b:/^\*{3} /,e:/$/},{b:/^\+{3}/,e:/$/},{b:/\*{5}/,e:/\*{5}$/}]},{cN:"addition",b:"^\\+",e:"$"},{cN:"deletion",b:"^\\-",e:"$"},{cN:"addition",b:"^\\!",e:"$"}]}});hljs.registerLanguage("java",function(e){var t=e.UIR+"(<"+e.UIR+"(\\s*,\\s*"+e.UIR+")*>)?",a="false synchronized int abstract float private char boolean static null if const for true while long strictfp finally protected import native final void enum else break transient catch instanceof byte super volatile case assert short package default double public try this switch continue throws protected public private module requires 
exports",r="\\b(0[bB]([01]+[01_]+[01]+|[01]+)|0[xX]([a-fA-F0-9]+[a-fA-F0-9_]+[a-fA-F0-9]+|[a-fA-F0-9]+)|(([\\d]+[\\d_]+[\\d]+|[\\d]+)(\\.([\\d]+[\\d_]+[\\d]+|[\\d]+))?|\\.([\\d]+[\\d_]+[\\d]+|[\\d]+))([eE][-+]?\\d+)?)[lLfF]?",s={cN:"number",b:r,r:0};return{aliases:["jsp"],k:a,i:/<\/|#/,c:[e.C("/\\*\\*","\\*/",{r:0,c:[{b:/\w+@/,r:0},{cN:"doctag",b:"@[A-Za-z]+"}]}),e.CLCM,e.CBCM,e.ASM,e.QSM,{cN:"class",bK:"class interface",e:/[{;=]/,eE:!0,k:"class interface",i:/[:"\[\]]/,c:[{bK:"extends implements"},e.UTM]},{bK:"new throw return else",r:0},{cN:"function",b:"("+t+"\\s+)+"+e.UIR+"\\s*\\(",rB:!0,e:/[{;=]/,eE:!0,k:a,c:[{b:e.UIR+"\\s*\\(",rB:!0,r:0,c:[e.UTM]},{cN:"params",b:/\(/,e:/\)/,k:a,r:0,c:[e.ASM,e.QSM,e.CNM,e.CBCM]},e.CLCM,e.CBCM]},s,{cN:"meta",b:"@[A-Za-z]+"}]}});hljs.registerLanguage("bash",function(e){var t={cN:"variable",v:[{b:/\$[\w\d#@][\w\d_]*/},{b:/\$\{(.*?)}/}]},s={cN:"string",b:/"/,e:/"/,c:[e.BE,t,{cN:"variable",b:/\$\(/,e:/\)/,c:[e.BE]}]},a={cN:"string",b:/'/,e:/'/};return{aliases:["sh","zsh"],l:/-?[a-z\._]+/,k:{keyword:"if then else elif fi for while in do done case esac function",literal:"true false",built_in:"break cd continue eval exec exit export getopts hash pwd readonly return shift test times trap umask unset alias bind builtin caller command declare echo enable help let local logout mapfile printf read readarray source type typeset ulimit unalias set shopt autoload bg bindkey bye cap chdir clone comparguments compcall compctl compdescribe compfiles compgroups compquote comptags comptry compvalues dirs disable disown echotc echoti emulate fc fg float functions getcap getln history integer jobs kill limit log noglob popd print pushd pushln rehash sched setcap setopt stat suspend ttyctl unfunction unhash unlimit unsetopt vared wait whence where which zcompile zformat zftp zle zmodload zparseopts zprof zpty zregexparse zsocket zstyle ztcp",_:"-ne -eq -lt -gt -f -d -e -s -l 
-a"},c:[{cN:"meta",b:/^#![^\n]+sh\s*$/,r:10},{cN:"function",b:/\w[\w\d_]*\s*\(\s*\)\s*\{/,rB:!0,c:[e.inherit(e.TM,{b:/\w[\w\d_]*/})],r:0},e.HCM,s,a,t]}});hljs.registerLanguage("javascript",function(e){var r="[A-Za-z$_][0-9A-Za-z$_]*",t={keyword:"in of if for while finally var new function do return void else break catch instanceof with throw case default try this switch continue typeof delete let yield const export super debugger as async await static import from as",literal:"true false null undefined NaN Infinity",built_in:"eval isFinite isNaN parseFloat parseInt decodeURI decodeURIComponent encodeURI encodeURIComponent escape unescape Object Function Boolean Error EvalError InternalError RangeError ReferenceError StopIteration SyntaxError TypeError URIError Number Math Date String RegExp Array Float32Array Float64Array Int16Array Int32Array Int8Array Uint16Array Uint32Array Uint8Array Uint8ClampedArray ArrayBuffer DataView JSON Intl arguments require module console window document Symbol Set Map WeakSet WeakMap Proxy Reflect Promise"},a={cN:"number",v:[{b:"\\b(0[bB][01]+)"},{b:"\\b(0[oO][0-7]+)"},{b:e.CNR}],r:0},n={cN:"subst",b:"\\$\\{",e:"\\}",k:t,c:[]},c={cN:"string",b:"`",e:"`",c:[e.BE,n]};n.c=[e.ASM,e.QSM,c,a,e.RM];var s=n.c.concat([e.CBCM,e.CLCM]);return{aliases:["js","jsx"],k:t,c:[{cN:"meta",r:10,b:/^\s*['"]use (strict|asm)['"]/},{cN:"meta",b:/^#!/,e:/$/},e.ASM,e.QSM,c,e.CLCM,e.CBCM,a,{b:/[{,]\s*/,r:0,c:[{b:r+"\\s*:",rB:!0,r:0,c:[{cN:"attr",b:r,r:0}]}]},{b:"("+e.RSR+"|\\b(case|return|throw)\\b)\\s*",k:"return throw 
case",c:[e.CLCM,e.CBCM,e.RM,{cN:"function",b:"(\\(.*?\\)|"+r+")\\s*=>",rB:!0,e:"\\s*=>",c:[{cN:"params",v:[{b:r},{b:/\(\s*\)/},{b:/\(/,e:/\)/,eB:!0,eE:!0,k:t,c:s}]}]},{b:/</,e:/(\/\w+|\w+\/)>/,sL:"xml",c:[{b:/<\w+\s*\/>/,skip:!0},{b:/<\w+/,e:/(\/\w+|\w+\/)>/,skip:!0,c:[{b:/<\w+\s*\/>/,skip:!0},"self"]}]}],r:0},{cN:"function",bK:"function",e:/\{/,eE:!0,c:[e.inherit(e.TM,{b:r}),{cN:"params",b:/\(/,e:/\)/,eB:!0,eE:!0,c:s}],i:/\[|%/},{b:/\$[(.]/},e.METHOD_GUARD,{cN:"class",bK:"class",e:/[{;=]/,eE:!0,i:/[:"\[\]]/,c:[{bK:"extends"},e.UTM]},{bK:"constructor",e:/\{/,eE:!0}],i:/#(?!!)/}});hljs.registerLanguage("scala",function(e){var t={cN:"meta",b:"@[A-Za-z]+"},a={cN:"subst",v:[{b:"\\$[A-Za-z0-9_]+"},{b:"\\${",e:"}"}]},r={cN:"string",v:[{b:'"',e:'"',i:"\\n",c:[e.BE]},{b:'"""',e:'"""',r:10},{b:'[a-z]+"',e:'"',i:"\\n",c:[e.BE,a]},{cN:"string",b:'[a-z]+"""',e:'"""',c:[a],r:10}]},c={cN:"symbol",b:"'\\w[\\w\\d_]*(?!')"},i={cN:"type",b:"\\b[A-Z][A-Za-z0-9_]*",r:0},s={cN:"title",b:/[^0-9\n\t "'(),.`{}\[\]:;][^\n\t "'(),.`{}\[\]:;]+|[^0-9\n\t "'(),.`{}\[\]:;=]/,r:0},n={cN:"class",bK:"class object trait type",e:/[:={\[\n;]/,eE:!0,c:[{bK:"extends with",r:10},{b:/\[/,e:/\]/,eB:!0,eE:!0,r:0,c:[i]},{cN:"params",b:/\(/,e:/\)/,eB:!0,eE:!0,r:0,c:[i]},s]},l={cN:"function",bK:"def",e:/[:={\[(\n;]/,eE:!0,c:[s]};return{k:{literal:"true false null",keyword:"type yield lazy override def with val var sealed abstract private trait object if forSome for while throw finally protected extends import final return else break new catch super class case package default try this match continue throws implicit"},c:[e.CLCM,e.CBCM,r,c,i,l,n,e.CNM,t]}}); \ No newline at end of file
diff --git a/docs/SyntaxSummary.txt b/docs/syntax-summary.txt
index 519180775..519180775 100644
--- a/docs/SyntaxSummary.txt
+++ b/docs/syntax-summary.txt
diff --git a/docs/usage/migrating.md b/docs/usage/migrating.md
new file mode 100644
index 000000000..d835aeea6
--- /dev/null
+++ b/docs/usage/migrating.md
@@ -0,0 +1,46 @@
+---
+layout: default
+title: "Migrating to Dotty"
+---
+
+Migrating to Dotty
+==================
+
+### Minor tweaks ###
+ * `sym.linkedClassOfClass` => `sym.linkedClass`
+ * `definitions` => `ctx.definitions`
+
+### Member Lookup ###
+`tpe.member(name)` and `tpe.decl(name)` now return a `Denotation`, not a
+`Symbol`. If no definition is found they return `NoDenotation` (instead of
+`NoSymbol`).
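+
+A minimal sketch of the new lookup pattern (assuming some `tpe` and `name` in
+scope; `exists` and `symbol` are the `Denotation` accessors used to recover
+the old behaviour):
+
+```scala
+val denot = tpe.member(name)  // Denotation, not Symbol
+if (denot.exists)             // scalac equivalent: `sym != NoSymbol`
+  process(denot.symbol)       // go back to a Symbol when one is needed
+```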
+
+### Symbol Properties ###
+Most `sym.isProperty` methods do not exist in dotc; test for flags instead. See
+[dotc vs scalac: Trees, Symbols, Types & Denotations]
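+
+As a hedged illustration (the flag names below are examples, check
+`Flags.scala` for the exact set; flag tests also take an implicit `Context`):
+
+```scala
+// scalac: sym.isModule  =>  dotc: sym.is(Flags.Module)
+// scalac: sym.isLazy    =>  dotc: sym.is(Flags.Lazy)
+if (sym.is(Flags.Module)) ...
+```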
+
+### Logging, Error Reporting, Failures ###
+
+There are various kinds of logging:
+ * Errors, warnings, etc.: `ctx.inform`, `ctx.warning`, `ctx.error`, ...
+ * Log messages displayed under `-Ylog:phase`: `log(msg)` in scalac =>
+ `ctx.log(msg)` in dotc
+ * Debug-Log messages displayed under `-Ydebug -Ylog:<phase>`: `debuglog(msg)`
+ in scalac => `ctx.debuglog(msg)` in dotc
+ * Assertions: `assert(invariant)`
+ * Fatal errors: `abort(msg)` in scalac => `throw new
+ dotty.tools.dotc.FatalError(msg)` in dotc
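+
+Side by side, the mappings above look roughly like this (an implicit
+`Context` named `ctx` is assumed to be in scope; messages and positions are
+illustrative):
+
+```scala
+ctx.warning("deprecated construct", tree.pos)  // scalac: warning(...)
+ctx.error("type mismatch", tree.pos)           // scalac: error(...)
+ctx.log("entering phase")                      // scalac: log(msg)
+ctx.debuglog("detailed state")                 // scalac: debuglog(msg)
+assert(tree != null, "invariant violated")     // assertions stay the same
+throw new dotty.tools.dotc.FatalError("oops")  // scalac: abort(msg)
+```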
+
+
+#### During development / debugging ####
+Instead of `Predef.println`, use a printer from [Printers.scala], e.g.
+`dotc.config.Printers.somePrinter.println`. Printers can easily be added,
+enabled, and disabled without changing command-line arguments.
+
+```scala
+val default: Printer = new Printer // new Printer => print
+val core: Printer = noPrinter // noPrinter => shut up
+```
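+
+To emit output through a printer (the message here is illustrative; a call
+only prints while the corresponding printer is set to `new Printer`, and is
+silent while it is `noPrinter`):
+
+```scala
+import dotty.tools.dotc.config.Printers
+Printers.core.println("recomputing denotation")
+```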
+
+[dotc vs scalac: Trees, Symbols, Types & Denotations]: https://github.com/lampepfl/dotty/wiki/dotc-vs-scalac:-Trees,-Symbols,-Types-&-Denotations
+[Printers.scala]: https://github.com/lampepfl/dotty/blob/master/src/dotty/tools/dotc/config/Printers.scala
diff --git a/docs/usage/sbt-projects.md b/docs/usage/sbt-projects.md
new file mode 100644
index 000000000..79418850d
--- /dev/null
+++ b/docs/usage/sbt-projects.md
@@ -0,0 +1,14 @@
+---
+layout: default
+title: "sbt"
+---
+
+Using Dotty with sbt
+====================
+It is now possible to use Dotty with sbt, thanks to the dotty-bridge project.
+There are two ways to create an sbt project that uses Dotty:
+
+* [dotty-example-project](https://github.com/smarter/dotty-example-project)
+ for a simple sbt project that compiles code with Dotty
+* [sbt-dotty](https://github.com/felixmulder/sbt-dotty), an sbt plugin that
+  takes care of all dependencies and settings needed for a Dotty sbt project