author     Antonio Cunei <antonio.cunei@epfl.ch>    2010-01-15 17:18:18 +0000
committer  Antonio Cunei <antonio.cunei@epfl.ch>    2010-01-15 17:18:18 +0000
commit     1ee78f5a8d1ac4beb3dff19f8aa147feea508e33 (patch)
tree       977e78a276ba4939cc40d55dd77149ae3e0e10e9
parent     914d6ff44e34a1c15a920d6c055df062cc31b4f3 (diff)
Merged revisions 20429,20437-20438,20444-20447,...
Merged revisions 20429,20437-20438,20444-20447,20449-20451,20453,20456-20457,
20459,20463-20466,20468-20469,20476-20478,20480-20482,20484,20486-20491,
20495-20496,20500-20502,20504,20515,20519,20522-20525 via svnmerge from
https://lampsvn.epfl.ch/svn-repos/scala/scala/trunk

........
r20429 | milessabin | 2010-01-10 13:08:40 +0100 (Sun, 10 Jan 2010) | 1 line
Weaken the test for co-definition to equality of paths (equality of files
fails where implementing types differ). Review by odersky.
........
r20437 | extempore | 2010-01-11 05:58:17 +0100 (Mon, 11 Jan 2010) | 3 lines
Fix for an unfortunate bug introduced in r19020 which was causing a great
many unnecessary and unreachable MatchErrors to appear in generated bytecode.
........
r20438 | rytz | 2010-01-11 09:55:42 +0100 (Mon, 11 Jan 2010) | 1 line
speed up ClassPath.findClass. review by community
........
r20444 | prokopec | 2010-01-11 16:44:22 +0100 (Mon, 11 Jan 2010) | 2 lines
Red black tree patch and test. no review
........
r20445 | odersky | 2010-01-11 16:48:20 +0100 (Mon, 11 Jan 2010) | 1 line
Revised List#mapConserve so that it tests wrt eq not ==.
........
r20446 | odersky | 2010-01-11 16:48:58 +0100 (Mon, 11 Jan 2010) | 1 line
Removed extraneous clause is isStrictlyMoreSpecific
........
r20447 | odersky | 2010-01-11 16:49:51 +0100 (Mon, 11 Jan 2010) | 1 line
Avoided a crash scenario in the presentation compiler.
........
r20449 | prokopec | 2010-01-11 17:09:36 +0100 (Mon, 11 Jan 2010) | 2 lines
Fixed #2810. no review
........
r20450 | extempore | 2010-01-11 17:26:44 +0100 (Mon, 11 Jan 2010) | 3 lines
Fix for #2883, a regression introduced in r18789. It was only a regression
because the pattern matcher has extractor bugs which don't manifest for case
classes. Underlying bug remains. No review.
........
r20451 | extempore | 2010-01-11 17:31:44 +0100 (Mon, 11 Jan 2010) | 1 line
Commented out some (debugging?) code which was breaking the build.
........
r20453 | extempore | 2010-01-11 18:16:56 +0100 (Mon, 11 Jan 2010) | 1 line
Fix and test case for #2364, which regressed with the fix to #2721.
........
r20456 | milessabin | 2010-01-11 18:44:32 +0100 (Mon, 11 Jan 2010) | 1 line
Reverting stray commit to Eclipse metadata.
........
r20457 | extempore | 2010-01-11 18:57:42 +0100 (Mon, 11 Jan 2010) | 2 lines
Moved the test for #2364 to pending because it apparently uses some
nonstandard classes.
........
r20459 | extempore | 2010-01-12 00:29:25 +0100 (Tue, 12 Jan 2010) | 8 lines
A few repl features. Added the following commands:
  :history <N>  shows N lines of history
  :h? <str>     greps the history for str
Altered tab-completion to be less verbose on the first tab, but notice when
tab has been hit twice without any other input, and then be more verbose.
And prettified the repl help text.
........
r20463 | plocinic | 2010-01-12 12:24:14 +0100 (Tue, 12 Jan 2010) | 1 line
Compare typeParams correctly for symbols so that the build manager no longer
reports false changes, cloneInfo instead instead of symbols. No review.
........
r20464 | extempore | 2010-01-12 14:44:43 +0100 (Tue, 12 Jan 2010) | 1 line
Added toMap to TraversableLike.
........
r20465 | extempore | 2010-01-12 15:11:02 +0100 (Tue, 12 Jan 2010) | 2 lines
A fix for at least one manifestation of #2865. Infinite collections and
"size" don't mix!
........
r20466 | cunei | 2010-01-12 16:11:07 +0100 (Tue, 12 Jan 2010) | 3 lines
Deprecated "=>?". Closes #2860 (see #2819).
........
r20468 | dragos | 2010-01-12 17:55:26 +0100 (Tue, 12 Jan 2010) | 5 lines
Updated attempt at removing @inline warnings:
 - fixed a bug in closure elimination causing VerifyErrors
 - fixed a broken assert in GenICode that fired when -Ydebug was used
 - added final modifiers
........
r20469 | odersky | 2010-01-12 18:17:50 +0100 (Tue, 12 Jan 2010) | 1 line
Closes #2867. review by extempore.
........
r20476 | plocinic | 2010-01-13 12:05:24 +0100 (Wed, 13 Jan 2010) | 1 line
When comparing type aliases use info instead of tpe so that the changes are
correctly detected. Fixes #2650. No review.
........
r20477 | extempore | 2010-01-13 14:08:53 +0100 (Wed, 13 Jan 2010) | 2 lines
Overrode slice in the StringLike derivatives to use substring. Closes #2895.
Review by community.
........
r20478 | odersky | 2010-01-13 14:37:30 +0100 (Wed, 13 Jan 2010) | 1 line
more statistics hooks. no review necessary.
........
r20480 | extempore | 2010-01-13 15:38:37 +0100 (Wed, 13 Jan 2010) | 3 lines
A variety of bugfixes discovered by findbugs. Most of them are examples of
equality comparisons which are guaranteed to return false because someone is
not comparing what they think they're comparing.
........
r20481 | moors | 2010-01-13 16:44:11 +0100 (Wed, 13 Jan 2010) | 2 lines
closes #2421: more complete fix, now also check validity of inferred type
arguments for expressions inferred for implicit values
review by odersky
........
r20482 | extempore | 2010-01-13 17:02:41 +0100 (Wed, 13 Jan 2010) | 1 line
Fix for #2817. Review by mharrah.
........
r20484 | odersky | 2010-01-13 17:27:40 +0100 (Wed, 13 Jan 2010) | 1 line
Fixes #2755, but leaving open to analyze issue raised by Paul. review by
extempore.
........
r20486 | odersky | 2010-01-13 17:36:06 +0100 (Wed, 13 Jan 2010) | 1 line
Closes #2866, #2870. Attempt to fix #2733 by having only non-local members
be visible for imports. However, this causes the interpreter to fail.
review by extempore.
........
r20487 | odersky | 2010-01-13 17:36:29 +0100 (Wed, 13 Jan 2010) | 1 line
new tests. no review.
........
r20488 | dubochet | 2010-01-13 18:00:14 +0100 (Wed, 13 Jan 2010) | 1 line
[scaladoc] Use cases are printed. Reduced memory footprint of Scaladoc model.
Review by community.
........
r20489 | moors | 2010-01-13 18:27:01 +0100 (Wed, 13 Jan 2010) | 1 line
better fix for see #2421 after feedback from Martin
........
r20490 | extempore | 2010-01-13 23:50:40 +0100 (Wed, 13 Jan 2010) | 2 lines
Reverts r20311 since I'm not seeing what's going on in #2876 and the
optimization can wait.
........
r20491 | prokopec | 2010-01-14 00:42:33 +0100 (Thu, 14 Jan 2010) | 1 line
Added ConcurrentMap and Properties conversion classes and test.
........
r20495 | extempore | 2010-01-14 02:22:05 +0100 (Thu, 14 Jan 2010) | 2 lines
Finished up fixing #2773. Interpreter tries not to accidentally import
synthetic locals from previous scopes.
........
r20496 | rytz | 2010-01-14 09:43:51 +0100 (Thu, 14 Jan 2010) | 1 line
fix for .net compiler (flatten is skipped). no review (already done by dragos).
........
r20500 | dubochet | 2010-01-14 14:22:03 +0100 (Thu, 14 Jan 2010) | 1 line
[scaladoc] Use cases are marked as such in the documentation (using some
changes in r20488). Review by odersky.
........
r20501 | milessabin | 2010-01-14 17:10:18 +0100 (Thu, 14 Jan 2010) | 1 line
Fixed #2889. No review necessary.
........
r20502 | plocinic | 2010-01-14 17:29:39 +0100 (Thu, 14 Jan 2010) | 1 line
Closes #2649. No review.
........
r20504 | prokopec | 2010-01-14 18:02:45 +0100 (Thu, 14 Jan 2010) | 4 lines
ConcurrentMap trait added to collection.mutable. JavaConversions now include
conversions between Java ConcurrentMap objects and Scala ConcurrentMap
objects. review by odersky
........
r20515 | phaller | 2010-01-15 00:55:41 +0100 (Fri, 15 Jan 2010) | 1 line
Some optimizations to actor message queues and event handling.
........
r20519 | extempore | 2010-01-15 03:12:10 +0100 (Fri, 15 Jan 2010) | 1 line
Fix and test for #2354. Review by community.
........
r20522 | milessabin | 2010-01-15 15:12:20 +0100 (Fri, 15 Jan 2010) | 1 line
Fixes for various Scaladoc-related positions regressions with tests.
Review by dubochet.
........
r20523 | dubochet | 2010-01-15 15:41:19 +0100 (Fri, 15 Jan 2010) | 1 line
Fixed issue when searching for companion of class using "linkedSym" when
value of same name is overloaded (for example companion of the Value class
in an Enumeration). No review, already checked by odersky.
........
r20524 | dubochet | 2010-01-15 15:43:34 +0100 (Fri, 15 Jan 2010) | 1 line
[scaladoc] Companion classes are printed. Original code contributed by
Pedro Furlanetto. No review, checked by dubochet.
........
r20525 | cunei | 2010-01-15 17:58:28 +0100 (Fri, 15 Jan 2010) | 2 lines
Reverted over-zealous replacement of 'PartialFunction' with '=>?'.
........
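
Much of this merge mechanically spells out the deprecated `=>?` alias (deprecated in r20466, with over-zealous replacements reverted in r20525) as PartialFunction. The sketch below is not part of the patch; assuming `A =>? B` was simply an alias for PartialFunction[A, B], it only shows the spelled-out form that the diffs below switch to.

    // Illustration only, not part of the patch. Assumes the deprecated
    // `A =>? B` syntax was an alias for PartialFunction[A, B].
    object PartialFunctionSpelling {
      // The spelled-out form used throughout the diffs below:
      val handler: PartialFunction[Any, String] = {
        case s: String => "string: " + s
        case n: Int    => "int: " + n
      }

      def main(args: Array[String]): Unit = {
        println(handler("hello"))          // string: hello
        println(handler.isDefinedAt(1.5))  // false: no case handles Double
      }
    }
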
-rw-r--r--  src/actors/scala/actors/Actor.scala | 54
-rw-r--r--  src/actors/scala/actors/Channel.scala | 8
-rw-r--r--  src/actors/scala/actors/Future.scala | 6
-rw-r--r--  src/actors/scala/actors/InputChannel.scala | 8
-rw-r--r--  src/actors/scala/actors/MessageQueue.scala | 65
-rw-r--r--  src/actors/scala/actors/ReactChannel.scala | 8
-rw-r--r--  src/actors/scala/actors/Reaction.scala | 2
-rw-r--r--  src/actors/scala/actors/Reactor.scala | 62
-rw-r--r--  src/actors/scala/actors/ReactorTask.scala | 2
-rw-r--r--  src/actors/scala/actors/ReplyReactor.scala | 16
-rw-r--r--  src/actors/scala/actors/Replyable.scala | 2
-rw-r--r--  src/actors/scala/actors/ReplyableActor.scala | 8
-rw-r--r--  src/actors/scala/actors/ReplyableReactor.scala | 2
-rw-r--r--  src/actors/scala/actors/remote/Proxy.scala | 2
-rw-r--r--  src/compiler/scala/tools/ant/sabbus/Settings.scala | 1
-rw-r--r--  src/compiler/scala/tools/nsc/Interpreter.scala | 45
-rw-r--r--  src/compiler/scala/tools/nsc/InterpreterLoop.scala | 75
-rw-r--r--  src/compiler/scala/tools/nsc/Settings.scala | 2
-rwxr-xr-x  src/compiler/scala/tools/nsc/ast/DocComments.scala | 14
-rw-r--r--  src/compiler/scala/tools/nsc/ast/parser/MarkupParsers.scala | 54
-rw-r--r--  src/compiler/scala/tools/nsc/ast/parser/Parsers.scala | 30
-rw-r--r--  src/compiler/scala/tools/nsc/ast/parser/Scanners.scala | 5
-rw-r--r--  src/compiler/scala/tools/nsc/backend/icode/GenICode.scala | 11
-rw-r--r--  src/compiler/scala/tools/nsc/backend/icode/analysis/CopyPropagation.scala | 6
-rw-r--r--  src/compiler/scala/tools/nsc/backend/opt/Inliners.scala | 2
-rw-r--r--  src/compiler/scala/tools/nsc/dependencies/Changes.scala | 38
-rw-r--r--  src/compiler/scala/tools/nsc/doc/DocFactory.scala | 5
-rw-r--r--  src/compiler/scala/tools/nsc/doc/DocProvider.scala | 3
-rw-r--r--  src/compiler/scala/tools/nsc/doc/SourcelessComments.scala | 11
-rw-r--r--  src/compiler/scala/tools/nsc/doc/html/page/Template.scala | 34
-rw-r--r--  src/compiler/scala/tools/nsc/doc/model/Entity.scala | 5
-rw-r--r--  src/compiler/scala/tools/nsc/doc/model/ModelFactory.scala | 257
-rw-r--r--  src/compiler/scala/tools/nsc/doc/model/comment/CommentFactory.scala | 2
-rw-r--r--  src/compiler/scala/tools/nsc/interactive/RefinedBuildManager.scala | 25
-rw-r--r--  src/compiler/scala/tools/nsc/interpreter/Completion.scala | 39
-rw-r--r--  src/compiler/scala/tools/nsc/interpreter/InteractiveReader.scala | 4
-rw-r--r--  src/compiler/scala/tools/nsc/interpreter/JLineReader.scala | 18
-rw-r--r--  src/compiler/scala/tools/nsc/interpreter/SimpleReader.scala | 2
-rw-r--r--  src/compiler/scala/tools/nsc/matching/MatrixAdditions.scala | 2
-rw-r--r--  src/compiler/scala/tools/nsc/matching/ParallelMatching.scala | 4
-rw-r--r--  src/compiler/scala/tools/nsc/symtab/Definitions.scala | 2
-rw-r--r--  src/compiler/scala/tools/nsc/symtab/Symbols.scala | 19
-rw-r--r--  src/compiler/scala/tools/nsc/symtab/Types.scala | 331
-rw-r--r--  src/compiler/scala/tools/nsc/transform/CleanUp.scala | 2
-rw-r--r--  src/compiler/scala/tools/nsc/transform/Erasure.scala | 23
-rw-r--r--  src/compiler/scala/tools/nsc/transform/SpecializeTypes.scala | 7
-rw-r--r--  src/compiler/scala/tools/nsc/transform/UnCurry.scala | 4
-rw-r--r--  src/compiler/scala/tools/nsc/typechecker/Contexts.scala | 38
-rw-r--r--  src/compiler/scala/tools/nsc/typechecker/Implicits.scala | 16
-rw-r--r--  src/compiler/scala/tools/nsc/typechecker/Infer.scala | 10
-rw-r--r--  src/compiler/scala/tools/nsc/typechecker/Namers.scala | 6
-rw-r--r--  src/compiler/scala/tools/nsc/typechecker/SuperAccessors.scala | 2
-rw-r--r--  src/compiler/scala/tools/nsc/typechecker/SyntheticMethods.scala | 11
-rw-r--r--  src/compiler/scala/tools/nsc/typechecker/Typers.scala | 101
-rw-r--r--  src/compiler/scala/tools/nsc/util/ClassPath.scala | 32
-rw-r--r--  src/compiler/scala/tools/nsc/util/Statistics.scala | 73
-rw-r--r--  src/library/scala/Option.scala | 2
-rw-r--r--  src/library/scala/PartialFunction.scala | 8
-rw-r--r--  src/library/scala/Predef.scala | 4
-rw-r--r--  src/library/scala/collection/IndexedSeqViewLike.scala | 76
-rw-r--r--  src/library/scala/collection/IterableViewLike.scala | 28
-rw-r--r--  src/library/scala/collection/Iterator.scala | 7
-rw-r--r--  src/library/scala/collection/JavaConversions.scala | 164
-rw-r--r--  src/library/scala/collection/SeqViewLike.scala | 26
-rw-r--r--  src/library/scala/collection/TraversableLike.scala | 24
-rw-r--r--  src/library/scala/collection/TraversableProxyLike.scala | 2
-rw-r--r--  src/library/scala/collection/TraversableViewLike.scala | 23
-rw-r--r--  src/library/scala/collection/immutable/List.scala | 8
-rw-r--r--  src/library/scala/collection/immutable/RedBlack.scala | 77
-rw-r--r--  src/library/scala/collection/immutable/StringOps.scala | 5
-rw-r--r--  src/library/scala/collection/immutable/WrappedString.scala | 3
-rw-r--r--  src/library/scala/collection/interfaces/TraversableMethods.scala | 2
-rw-r--r--  src/library/scala/collection/mutable/ConcurrentMap.scala | 79
-rw-r--r--  src/library/scala/collection/mutable/FlatHashTable.scala | 2
-rw-r--r--  src/library/scala/collection/mutable/IndexedSeqView.scala | 5
-rw-r--r--  src/library/scala/collection/mutable/PriorityQueue.scala | 2
-rw-r--r--  src/library/scala/collection/views/Transformed.scala | 128
-rw-r--r--  src/library/scala/concurrent/MailBox.scala | 6
-rw-r--r--  src/library/scala/io/Source.scala | 2
-rw-r--r--  src/library/scala/math/BigDecimal.scala | 2
-rw-r--r--  src/library/scala/package.scala | 2
-rw-r--r--  src/library/scala/runtime/ScalaRunTime.scala | 4
-rw-r--r--  src/library/scala/util/control/Exception.scala | 14
-rw-r--r--  src/library/scala/util/parsing/combinator/Parsers.scala | 14
-rw-r--r--  src/library/scala/xml/Text.scala | 30
-rw-r--r--  src/library/scala/xml/Utility.scala | 7
-rw-r--r--  src/library/scala/xml/parsing/FactoryAdapter.scala | 4
-rw-r--r--  src/library/scala/xml/parsing/MarkupParser.scala | 27
-rw-r--r--  src/library/scala/xml/parsing/MarkupParserCommon.scala | 48
-rw-r--r--  src/scalap/scala/tools/scalap/Main.scala | 6
-rw-r--r--  src/scalap/scala/tools/scalap/scalax/rules/Rule.scala | 6
-rw-r--r--  src/swing/scala/swing/Reactions.scala | 2
-rw-r--r--  test/files/neg/t2421b.check | 4
-rw-r--r--  test/files/neg/t2421b.scala | 17
-rw-r--r--  test/files/neg/t2641.check | 2
-rw-r--r--  test/files/neg/t2870.check | 7
-rwxr-xr-x  test/files/neg/t2870.scala | 9
-rw-r--r--  test/files/pos/t2421b.scala | 19
-rw-r--r--  test/files/pos/t2810.scala | 8
-rw-r--r--  test/files/pos/t2867.scala | 1
-rw-r--r--  test/files/positions/Scaladoc6.scala | 10
-rw-r--r--  test/files/positions/Scaladoc7.scala | 6
-rw-r--r--  test/files/positions/Scaladoc8.scala | 6
-rw-r--r--  test/files/run/bug2354.scala | 17
-rw-r--r--  test/files/run/bug2876.scala | 7
-rw-r--r--  test/files/run/map_java_conversions.scala | 60
-rw-r--r--  test/files/run/t2849.scala | 46
-rw-r--r--  test/pending/run/bug2364.check | 1
-rw-r--r--  test/pending/run/bug2364.scala | 60
109 files changed, 1889 insertions, 856 deletions
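
r20491 and r20504 above add scala.collection.mutable.ConcurrentMap and matching JavaConversions (see JavaConversions.scala, mutable/ConcurrentMap.scala and the map_java_conversions test in the stat). A hedged usage sketch, assuming the new implicit view from java.util.concurrent.ConcurrentMap is brought into scope by importing JavaConversions._:

    // Usage sketch only; assumes the implicit conversion added in r20504
    // views a java.util.concurrent.ConcurrentMap as a
    // scala.collection.mutable.ConcurrentMap once JavaConversions is imported.
    import java.util.concurrent.ConcurrentHashMap
    import scala.collection.JavaConversions._
    import scala.collection.mutable.ConcurrentMap

    object ConcurrentMapConversionDemo {
      def main(args: Array[String]): Unit = {
        val jmap = new ConcurrentHashMap[String, Int]()
        val smap: ConcurrentMap[String, Int] = jmap   // implicit view
        smap.putIfAbsent("answer", 42)
        println(smap.get("answer"))                   // Some(42)
      }
    }
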
diff --git a/src/actors/scala/actors/Actor.scala b/src/actors/scala/actors/Actor.scala
index 907389b9f0..838d3a8f63 100644
--- a/src/actors/scala/actors/Actor.scala
+++ b/src/actors/scala/actors/Actor.scala
@@ -160,7 +160,7 @@ object Actor {
* @param f a partial function specifying patterns and actions
* @return the result of processing the received message
*/
- def receive[A](f: Any =>? A): A =
+ def receive[A](f: PartialFunction[Any, A]): A =
self.receive(f)
/**
@@ -175,7 +175,7 @@ object Actor {
* @param f a partial function specifying patterns and actions
* @return the result of processing the received message
*/
- def receiveWithin[R](msec: Long)(f: Any =>? R): R =
+ def receiveWithin[R](msec: Long)(f: PartialFunction[Any, R]): R =
self.receiveWithin(msec)(f)
/**
@@ -188,7 +188,7 @@ object Actor {
* @param f a partial function specifying patterns and actions
* @return this function never returns
*/
- def react(f: Any =>? Unit): Nothing =
+ def react(f: PartialFunction[Any, Unit]): Nothing =
rawSelf.react(f)
/**
@@ -202,14 +202,14 @@ object Actor {
* @param f a partial function specifying patterns and actions
* @return this function never returns
*/
- def reactWithin(msec: Long)(f: Any =>? Unit): Nothing =
+ def reactWithin(msec: Long)(f: PartialFunction[Any, Unit]): Nothing =
self.reactWithin(msec)(f)
- def eventloop(f: Any =>? Unit): Nothing =
+ def eventloop(f: PartialFunction[Any, Unit]): Nothing =
rawSelf.react(new RecursiveProxyHandler(rawSelf, f))
- private class RecursiveProxyHandler(a: Reactor, f: Any =>? Unit)
- extends (Any =>? Unit) {
+ private class RecursiveProxyHandler(a: Reactor, f: PartialFunction[Any, Unit])
+ extends PartialFunction[Any, Unit] {
def isDefinedAt(m: Any): Boolean =
true // events are immediately removed from the mailbox
def apply(m: Any) {
@@ -261,9 +261,9 @@ object Actor {
* }
* </pre>
*/
- def respondOn[A, B](fun: A =>? Unit => Nothing):
- A =>? B => Responder[B] =
- (caseBlock: A =>? B) => new Responder[B] {
+ def respondOn[A, B](fun: PartialFunction[A, Unit] => Nothing):
+ PartialFunction[A, B] => Responder[B] =
+ (caseBlock: PartialFunction[A, B]) => new Responder[B] {
def respond(k: B => Unit) = fun(caseBlock andThen k)
}
@@ -400,7 +400,7 @@ trait Actor extends AbstractActor with ReplyReactor with ReplyableActor {
protected[actors] override def scheduler: IScheduler = Scheduler
- private[actors] override def startSearch(msg: Any, replyTo: OutputChannel[Any], handler: Any => Boolean) =
+ private[actors] override def startSearch(msg: Any, replyTo: OutputChannel[Any], handler: PartialFunction[Any, Any]) =
if (isSuspended) {
() => synchronized {
mailbox.append(msg, replyTo)
@@ -411,7 +411,7 @@ trait Actor extends AbstractActor with ReplyReactor with ReplyableActor {
private[actors] override def makeReaction(fun: () => Unit): Runnable =
new ActorTask(this, fun)
- private[actors] override def resumeReceiver(item: (Any, OutputChannel[Any]), onSameThread: Boolean) {
+ private[actors] override def resumeReceiver(item: (Any, OutputChannel[Any]), handler: PartialFunction[Any, Any], onSameThread: Boolean) {
synchronized {
if (!onTimeout.isEmpty) {
onTimeout.get.cancel()
@@ -419,7 +419,7 @@ trait Actor extends AbstractActor with ReplyReactor with ReplyableActor {
}
}
senders = List(item._2)
- super.resumeReceiver(item, onSameThread)
+ super.resumeReceiver(item, handler, onSameThread)
}
/**
@@ -428,7 +428,7 @@ trait Actor extends AbstractActor with ReplyReactor with ReplyableActor {
* @param f a partial function with message patterns and actions
* @return result of processing the received value
*/
- def receive[R](f: Any =>? R): R = {
+ def receive[R](f: PartialFunction[Any, R]): R = {
assert(Actor.self(scheduler) == this, "receive from channel belonging to other actor")
synchronized {
@@ -451,7 +451,7 @@ trait Actor extends AbstractActor with ReplyReactor with ReplyableActor {
drainSendBuffer(mailbox)
// keep going
} else {
- waitingFor = f.isDefinedAt
+ waitingFor = f
isSuspended = true
scheduler.managedBlock(blocker)
drainSendBuffer(mailbox)
@@ -479,7 +479,7 @@ trait Actor extends AbstractActor with ReplyReactor with ReplyableActor {
* @param f a partial function with message patterns and actions
* @return result of processing the received value
*/
- def receiveWithin[R](msec: Long)(f: Any =>? R): R = {
+ def receiveWithin[R](msec: Long)(f: PartialFunction[Any, R]): R = {
assert(Actor.self(scheduler) == this, "receive from channel belonging to other actor")
synchronized {
@@ -517,7 +517,7 @@ trait Actor extends AbstractActor with ReplyReactor with ReplyableActor {
done = true
receiveTimeout
} else {
- waitingFor = f.isDefinedAt
+ waitingFor = f
received = None
isSuspended = true
val thisActor = this
@@ -559,14 +559,13 @@ trait Actor extends AbstractActor with ReplyReactor with ReplyableActor {
*
* @param f a partial function with message patterns and actions
*/
- override def react(f: Any =>? Unit): Nothing = {
+ override def react(f: PartialFunction[Any, Unit]): Nothing = {
assert(Actor.self(scheduler) == this, "react on channel belonging to other actor")
synchronized {
if (shouldExit) exit() // links
drainSendBuffer(mailbox)
}
- continuation = f
- searchMailbox(mailbox, f.isDefinedAt, false)
+ searchMailbox(mailbox, f, false)
throw Actor.suspendException
}
@@ -580,7 +579,7 @@ trait Actor extends AbstractActor with ReplyReactor with ReplyableActor {
* @param msec the time span before timeout
* @param f a partial function with message patterns and actions
*/
- def reactWithin(msec: Long)(f: Any =>? Unit): Nothing = {
+ def reactWithin(msec: Long)(f: PartialFunction[Any, Unit]): Nothing = {
assert(Actor.self(scheduler) == this, "react on channel belonging to other actor")
synchronized {
@@ -616,8 +615,7 @@ trait Actor extends AbstractActor with ReplyReactor with ReplyableActor {
done = true
receiveTimeout
} else {
- waitingFor = f.isDefinedAt
- continuation = f
+ waitingFor = f
val thisActor = this
onTimeout = Some(new TimerTask {
def run() { thisActor.send(TIMEOUT, thisActor) }
@@ -647,14 +645,12 @@ trait Actor extends AbstractActor with ReplyReactor with ReplyableActor {
// guarded by lock of this
// never throws SuspendActorException
- private[actors] override def scheduleActor(f: Any =>? Unit, msg: Any) =
- if ((f eq null) && (continuation eq null)) {
+ private[actors] override def scheduleActor(f: PartialFunction[Any, Any], msg: Any) =
+ if (f eq null) {
// do nothing (timeout is handled instead)
}
else {
- val task = new Reaction(this,
- if (f eq null) continuation else f,
- msg)
+ val task = new Reaction(this, f, msg)
scheduler executeFromActor task
}
@@ -825,7 +821,7 @@ trait Actor extends AbstractActor with ReplyReactor with ReplyableActor {
if (isSuspended)
resumeActor()
else if (waitingFor ne waitingForNone) {
- scheduleActor(continuation, null)
+ scheduleActor(waitingFor, null)
/* Here we should not throw a SuspendActorException,
since the current method is called from an actor that
is in the process of exiting.
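
With the signatures above, receive and react take a PartialFunction directly, and the handler itself is stored in waitingFor, replacing the old continuation field plus isDefinedAt predicate. A minimal usage sketch against this scala.actors API (illustration only, not part of the patch):

    // Minimal usage sketch: react takes a PartialFunction[Any, Unit], which
    // after this patch is also what waitingFor holds.
    import scala.actors.Actor._

    object ReactDemo {
      def main(args: Array[String]): Unit = {
        val echo = actor {
          loop {
            react {
              case 'stop       => exit()
              case msg: String => println("echo: " + msg)
            }
          }
        }
        echo ! "hello"
        echo ! 'stop
      }
    }
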
diff --git a/src/actors/scala/actors/Channel.scala b/src/actors/scala/actors/Channel.scala
index 1454f29214..24340d22f2 100644
--- a/src/actors/scala/actors/Channel.scala
+++ b/src/actors/scala/actors/Channel.scala
@@ -76,7 +76,7 @@ class Channel[Msg](val receiver: Actor) extends InputChannel[Msg] with OutputCha
* @param f a partial function with message patterns and actions
* @return result of processing the received value
*/
- def receive[R](f: Msg =>? R): R = {
+ def receive[R](f: PartialFunction[Msg, R]): R = {
val C = this.asInstanceOf[Channel[Any]]
val recvActor = receiver.asInstanceOf[Actor]
recvActor.receive {
@@ -99,7 +99,7 @@ class Channel[Msg](val receiver: Actor) extends InputChannel[Msg] with OutputCha
* @param f a partial function with message patterns and actions
* @return result of processing the received value
*/
- def receiveWithin[R](msec: Long)(f: Any =>? R): R = {
+ def receiveWithin[R](msec: Long)(f: PartialFunction[Any, R]): R = {
val C = this.asInstanceOf[Channel[Any]]
val recvActor = receiver.asInstanceOf[Actor]
recvActor.receiveWithin(msec) {
@@ -116,7 +116,7 @@ class Channel[Msg](val receiver: Actor) extends InputChannel[Msg] with OutputCha
*
* @param f a partial function with message patterns and actions
*/
- def react(f: Msg =>? Unit): Nothing = {
+ def react(f: PartialFunction[Msg, Unit]): Nothing = {
val C = this.asInstanceOf[Channel[Any]]
receiver.react {
case C ! msg if (f.isDefinedAt(msg.asInstanceOf[Msg])) => f(msg.asInstanceOf[Msg])
@@ -133,7 +133,7 @@ class Channel[Msg](val receiver: Actor) extends InputChannel[Msg] with OutputCha
* @param msec the time span before timeout
* @param f a partial function with message patterns and actions
*/
- def reactWithin(msec: Long)(f: Any =>? Unit): Nothing = {
+ def reactWithin(msec: Long)(f: PartialFunction[Any, Unit]): Nothing = {
val C = this.asInstanceOf[Channel[Any]]
val recvActor = receiver.asInstanceOf[Actor]
recvActor.reactWithin(msec) {
diff --git a/src/actors/scala/actors/Future.scala b/src/actors/scala/actors/Future.scala
index 1369ed6255..ebb0489d88 100644
--- a/src/actors/scala/actors/Future.scala
+++ b/src/actors/scala/actors/Future.scala
@@ -153,7 +153,7 @@ object Futures {
val partFuns = unsetFts.map((p: Pair[Int, Future[Any]]) => {
val FutCh = p._2.inputChannel
- val singleCase: Any =>? Pair[Int, Any] = {
+ val singleCase: PartialFunction[Any, Pair[Int, Any]] = {
case FutCh ! any => Pair(p._1, any)
}
singleCase
@@ -165,8 +165,8 @@ object Futures {
}
Actor.timer.schedule(timerTask, timeout)
- def awaitWith(partFuns: Seq[Any =>? Pair[Int, Any]]) {
- val reaction: Any =>? Unit = new (Any =>? Unit) {
+ def awaitWith(partFuns: Seq[PartialFunction[Any, Pair[Int, Any]]]) {
+ val reaction: PartialFunction[Any, Unit] = new PartialFunction[Any, Unit] {
def isDefinedAt(msg: Any) = msg match {
case TIMEOUT => true
case _ => partFuns exists (_ isDefinedAt msg)
diff --git a/src/actors/scala/actors/InputChannel.scala b/src/actors/scala/actors/InputChannel.scala
index fb922f27b2..46988159fa 100644
--- a/src/actors/scala/actors/InputChannel.scala
+++ b/src/actors/scala/actors/InputChannel.scala
@@ -25,7 +25,7 @@ trait InputChannel[+Msg] {
* @param f a partial function with message patterns and actions
* @return result of processing the received value
*/
- def receive[R](f: Msg =>? R): R
+ def receive[R](f: PartialFunction[Msg, R]): R
/**
* Receives a message from this <code>InputChannel</code> within
@@ -35,7 +35,7 @@ trait InputChannel[+Msg] {
* @param f a partial function with message patterns and actions
* @return result of processing the received value
*/
- def receiveWithin[R](msec: Long)(f: Any =>? R): R
+ def receiveWithin[R](msec: Long)(f: PartialFunction[Any, R]): R
/**
* Receives a message from this <code>InputChannel</code>.
@@ -45,7 +45,7 @@ trait InputChannel[+Msg] {
*
* @param f a partial function with message patterns and actions
*/
- def react(f: Msg =>? Unit): Nothing
+ def react(f: PartialFunction[Msg, Unit]): Nothing
/**
* Receives a message from this <code>InputChannel</code> within
@@ -57,7 +57,7 @@ trait InputChannel[+Msg] {
* @param msec the time span before timeout
* @param f a partial function with message patterns and actions
*/
- def reactWithin(msec: Long)(f: Any =>? Unit): Nothing
+ def reactWithin(msec: Long)(f: PartialFunction[Any, Unit]): Nothing
/**
* Receives the next message from this <code>Channel</code>.
diff --git a/src/actors/scala/actors/MessageQueue.scala b/src/actors/scala/actors/MessageQueue.scala
index fd43e36fff..000ff1bfc6 100644
--- a/src/actors/scala/actors/MessageQueue.scala
+++ b/src/actors/scala/actors/MessageQueue.scala
@@ -62,6 +62,15 @@ private[actors] class MQueue(protected val label: String) {
last = el
}
+ def append(el: MQueueElement) {
+ changeSize(1) // size always increases by 1
+
+ if (isEmpty) first = el
+ else last.next = el
+
+ last = el
+ }
+
def foreach(f: (Any, OutputChannel[Any]) => Unit) {
var curr = first
while (curr != null) {
@@ -70,6 +79,25 @@ private[actors] class MQueue(protected val label: String) {
}
}
+ def foreachAppend(target: MQueue) {
+ var curr = first
+ while (curr != null) {
+ target.append(curr)
+ curr = curr.next
+ }
+ }
+
+ def foreachDequeue(target: MQueue) {
+ var curr = first
+ while (curr != null) {
+ target.append(curr)
+ curr = curr.next
+ }
+ first = null
+ last = null
+ _size = 0
+ }
+
def foldLeft[B](z: B)(f: (B, Any) => B): B = {
var acc = z
var curr = first
@@ -108,6 +136,43 @@ private[actors] class MQueue(protected val label: String) {
def extractFirst(p: (Any, OutputChannel[Any]) => Boolean): MQueueElement =
removeInternal(0)(p) orNull
+ def extractFirst(pf: PartialFunction[Any, Any]): MQueueElement = {
+ if (isEmpty) // early return
+ return null
+
+ // special handling if returning the head
+ if (pf.isDefinedAt(first.msg)) {
+ val res = first
+ first = first.next
+ if (res eq last)
+ last = null
+
+ changeSize(-1)
+ res
+ }
+ else {
+ var curr = first.next // init to element #2
+ var prev = first
+
+ while (curr != null) {
+ if (pf.isDefinedAt(curr.msg)) {
+ prev.next = curr.next
+ if (curr eq last)
+ last = prev
+
+ changeSize(-1)
+ return curr // early return
+ }
+ else {
+ prev = curr
+ curr = curr.next
+ }
+ }
+ // not found
+ null
+ }
+ }
+
private def removeInternal(n: Int)(p: (Any, OutputChannel[Any]) => Boolean): Option[MQueueElement] = {
var pos = 0
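
The new extractFirst(pf) above walks the linked message queue and removes the first element whose message the partial function is defined at, relinking first and last as it goes. A simplified sketch of the same idea on a plain buffer (the real MQueue keeps its own first/last/_size bookkeeping):

    // Simplified sketch of the extraction pattern above, using a ListBuffer
    // instead of MQueue's hand-rolled singly linked list.
    import scala.collection.mutable.ListBuffer

    class SimpleQueue[A] {
      private val elems = ListBuffer[A]()

      def append(a: A): Unit = elems += a

      // Remove and return the first element the handler is defined at, if any.
      def extractFirst(pf: PartialFunction[A, Any]): Option[A] = {
        val idx = elems.indexWhere(pf.isDefinedAt)
        if (idx < 0) None else Some(elems.remove(idx))
      }
    }
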
diff --git a/src/actors/scala/actors/ReactChannel.scala b/src/actors/scala/actors/ReactChannel.scala
index 926805fbe7..8bbbc04f53 100644
--- a/src/actors/scala/actors/ReactChannel.scala
+++ b/src/actors/scala/actors/ReactChannel.scala
@@ -55,7 +55,7 @@ private[actors] class ReactChannel[Msg](receiver: Reactor) extends InputChannel[
*
* @param f a partial function with message patterns and actions
*/
- def react(f: Msg =>? Unit): Nothing = {
+ def react(f: PartialFunction[Msg, Unit]): Nothing = {
val C = this
receiver.react {
case SendToReactor(C, msg) if (f.isDefinedAt(msg.asInstanceOf[Msg])) =>
@@ -73,7 +73,7 @@ private[actors] class ReactChannel[Msg](receiver: Reactor) extends InputChannel[
* @param msec the time span before timeout
* @param f a partial function with message patterns and actions
*/
- def reactWithin(msec: Long)(f: Any =>? Unit): Nothing = {
+ def reactWithin(msec: Long)(f: PartialFunction[Any, Unit]): Nothing = {
val C = this
val recvActor = receiver.asInstanceOf[Actor]
recvActor.reactWithin(msec) {
@@ -89,7 +89,7 @@ private[actors] class ReactChannel[Msg](receiver: Reactor) extends InputChannel[
* @param f a partial function with message patterns and actions
* @return result of processing the received value
*/
- def receive[R](f: Msg =>? R): R = {
+ def receive[R](f: PartialFunction[Msg, R]): R = {
val C = this
val recvActor = receiver.asInstanceOf[Actor]
recvActor.receive {
@@ -106,7 +106,7 @@ private[actors] class ReactChannel[Msg](receiver: Reactor) extends InputChannel[
* @param f a partial function with message patterns and actions
* @return result of processing the received value
*/
- def receiveWithin[R](msec: Long)(f: Any =>? R): R = {
+ def receiveWithin[R](msec: Long)(f: PartialFunction[Any, R]): R = {
val C = this
val recvActor = receiver.asInstanceOf[Actor]
recvActor.receiveWithin(msec) {
diff --git a/src/actors/scala/actors/Reaction.scala b/src/actors/scala/actors/Reaction.scala
index a4736f9489..753dd7da83 100644
--- a/src/actors/scala/actors/Reaction.scala
+++ b/src/actors/scala/actors/Reaction.scala
@@ -26,7 +26,7 @@ private[actors] class KillActorException extends Throwable with ControlException
* @deprecated("this class is going to be removed in a future release")
* @author Philipp Haller
*/
-class Reaction(a: Actor, f: Any =>? Unit, msg: Any) extends ActorTask(a, () => {
+class Reaction(a: Actor, f: PartialFunction[Any, Any], msg: Any) extends ActorTask(a, () => {
if (f == null)
a.act()
else
diff --git a/src/actors/scala/actors/Reactor.scala b/src/actors/scala/actors/Reactor.scala
index 8545b92d1e..eb0485263b 100644
--- a/src/actors/scala/actors/Reactor.scala
+++ b/src/actors/scala/actors/Reactor.scala
@@ -35,25 +35,24 @@ trait Reactor extends OutputChannel[Any] {
private[actors] val mailbox = new MQueue("Reactor")
// guarded by this
- private[actors] val sendBuffer = new Queue[(Any, OutputChannel[Any])]
+ private[actors] val sendBuffer = new MQueue("SendBuffer")
- /* If the actor waits in a react, continuation holds the
- * message handler that react was called with.
- */
- @volatile
- private[actors] var continuation: Any =>? Unit = null
-
- /* Whenever this Actor executes on some thread, waitingFor is
+ /* Whenever this actor executes on some thread, waitingFor is
* guaranteed to be equal to waitingForNone.
*
* In other words, whenever waitingFor is not equal to
- * waitingForNone, this Actor is guaranteed not to execute on some
+ * waitingForNone, this actor is guaranteed not to execute on some
* thread.
*/
- private[actors] val waitingForNone = (m: Any) => false
+ private[actors] val waitingForNone = new PartialFunction[Any, Unit] {
+ def isDefinedAt(x: Any) = false
+ def apply(x: Any) {}
+ }
- // guarded by lock of this
- private[actors] var waitingFor: Any => Boolean = waitingForNone
+ /* If the actor waits in a react, waitingFor holds the
+ * message handler that react was called with.
+ */
+ private[actors] var waitingFor: PartialFunction[Any, Any] = waitingForNone // guarded by lock of this
/**
* The behavior of an actor is specified by implementing this
@@ -61,7 +60,7 @@ trait Reactor extends OutputChannel[Any] {
*/
def act(): Unit
- protected[actors] def exceptionHandler: Exception =>? Unit =
+ protected[actors] def exceptionHandler: PartialFunction[Exception, Unit] =
Map()
protected[actors] def scheduler: IScheduler =
@@ -84,14 +83,14 @@ trait Reactor extends OutputChannel[Any] {
waitingFor = waitingForNone
startSearch(msg, replyTo, savedWaitingFor)
} else {
- sendBuffer.enqueue((msg, replyTo))
+ sendBuffer.append(msg, replyTo)
() => { /* do nothing */ }
}
}
todo()
}
- private[actors] def startSearch(msg: Any, replyTo: OutputChannel[Any], handler: Any => Boolean) =
+ private[actors] def startSearch(msg: Any, replyTo: OutputChannel[Any], handler: PartialFunction[Any, Any]) =
() => scheduler execute (makeReaction(() => {
val startMbox = new MQueue("Start")
synchronized { startMbox.append(msg, replyTo) }
@@ -101,15 +100,11 @@ trait Reactor extends OutputChannel[Any] {
private[actors] def makeReaction(fun: () => Unit): Runnable =
new ReactorTask(this, fun)
- /* Note that this method is called without holding a lock.
- * Therefore, to read an up-to-date continuation, it must be @volatile.
- */
- private[actors] def resumeReceiver(item: (Any, OutputChannel[Any]), onSameThread: Boolean) {
- // assert continuation != null
+ private[actors] def resumeReceiver(item: (Any, OutputChannel[Any]), handler: PartialFunction[Any, Any], onSameThread: Boolean) {
if (onSameThread)
- continuation(item._1)
+ handler(item._1)
else {
- scheduleActor(continuation, item._1)
+ scheduleActor(handler, item._1)
/* Here, we throw a SuspendActorException to avoid
terminating this actor when the current ReactorTask
is finished.
@@ -133,22 +128,18 @@ trait Reactor extends OutputChannel[Any] {
// guarded by this
private[actors] def drainSendBuffer(mbox: MQueue) {
- while (!sendBuffer.isEmpty) {
- val item = sendBuffer.dequeue()
- mbox.append(item._1, item._2)
- }
+ sendBuffer.foreachDequeue(mbox)
}
- // assume continuation != null
private[actors] def searchMailbox(startMbox: MQueue,
- handlesMessage: Any => Boolean,
+ handler: PartialFunction[Any, Any],
resumeOnSameThread: Boolean) {
var tmpMbox = startMbox
var done = false
while (!done) {
- val qel = tmpMbox.extractFirst((msg: Any, replyTo: OutputChannel[Any]) => handlesMessage(msg))
+ val qel = tmpMbox.extractFirst(handler)
if (tmpMbox ne mailbox)
- tmpMbox.foreach((m, s) => mailbox.append(m, s))
+ tmpMbox.foreachAppend(mailbox)
if (null eq qel) {
synchronized {
// in mean time new stuff might have arrived
@@ -157,7 +148,7 @@ trait Reactor extends OutputChannel[Any] {
drainSendBuffer(tmpMbox)
// keep going
} else {
- waitingFor = handlesMessage
+ waitingFor = handler
/* Here, we throw a SuspendActorException to avoid
terminating this actor when the current ReactorTask
is finished.
@@ -169,17 +160,16 @@ trait Reactor extends OutputChannel[Any] {
}
}
} else {
- resumeReceiver((qel.msg, qel.session), resumeOnSameThread)
+ resumeReceiver((qel.msg, qel.session), handler, resumeOnSameThread)
done = true
}
}
}
- protected[actors] def react(f: Any =>? Unit): Nothing = {
+ protected[actors] def react(f: PartialFunction[Any, Unit]): Nothing = {
assert(Actor.rawSelf(scheduler) == this, "react on channel belonging to other actor")
synchronized { drainSendBuffer(mailbox) }
- continuation = f
- searchMailbox(mailbox, f.isDefinedAt, false)
+ searchMailbox(mailbox, f, false)
throw Actor.suspendException
}
@@ -190,7 +180,7 @@ trait Reactor extends OutputChannel[Any] {
*
* never throws SuspendActorException
*/
- private[actors] def scheduleActor(handler: Any =>? Unit, msg: Any) = {
+ private[actors] def scheduleActor(handler: PartialFunction[Any, Any], msg: Any) = {
val fun = () => handler(msg)
val task = new ReactorTask(this, fun)
scheduler executeFromActor task
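
One detail worth noting from the Reactor hunk above: with waitingFor now typed as a PartialFunction, the "not waiting" state is modelled by a handler that is defined at no input. A self-contained restatement of that sentinel (illustration only):

    // Illustration of the sentinel used above: a PartialFunction that is
    // defined nowhere stands for "not currently waiting for any message".
    object EmptyHandler {
      val waitingForNone: PartialFunction[Any, Unit] = new PartialFunction[Any, Unit] {
        def isDefinedAt(x: Any) = false
        def apply(x: Any): Unit = ()
      }

      def main(args: Array[String]): Unit =
        println(waitingForNone.isDefinedAt("anything"))   // false
    }
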
diff --git a/src/actors/scala/actors/ReactorTask.scala b/src/actors/scala/actors/ReactorTask.scala
index f6ec67e94c..37aec0f8ec 100644
--- a/src/actors/scala/actors/ReactorTask.scala
+++ b/src/actors/scala/actors/ReactorTask.scala
@@ -20,7 +20,7 @@ import java.util.concurrent.Callable
*
* @author Philipp Haller
*/
-private[actors] class ReactorTask[T >: Null <: Reactor](var reactor: T, var fun: () => Unit)
+private[actors] class ReactorTask[T >: Null <: Reactor](var reactor: T, var fun: () => Any)
extends Callable[Unit] with Runnable {
def run() {
diff --git a/src/actors/scala/actors/ReplyReactor.scala b/src/actors/scala/actors/ReplyReactor.scala
index 64860f4d38..d5936ae662 100644
--- a/src/actors/scala/actors/ReplyReactor.scala
+++ b/src/actors/scala/actors/ReplyReactor.scala
@@ -52,28 +52,26 @@ trait ReplyReactor extends Reactor with ReplyableReactor {
send(msg, Actor.sender)
}
- private[actors] override def resumeReceiver(item: (Any, OutputChannel[Any]), onSameThread: Boolean) {
+ private[actors] override def resumeReceiver(item: (Any, OutputChannel[Any]), handler: PartialFunction[Any, Any], onSameThread: Boolean) {
senders = List(item._2)
- // assert continuation != null
if (onSameThread)
- continuation(item._1)
+ handler(item._1)
else {
- scheduleActor(continuation, item._1)
+ scheduleActor(handler, item._1)
// see Reactor.resumeReceiver
throw Actor.suspendException
}
}
- // assume continuation != null
private[actors] override def searchMailbox(startMbox: MQueue,
- handlesMessage: Any => Boolean,
+ handler: PartialFunction[Any, Any],
resumeOnSameThread: Boolean) {
var tmpMbox = startMbox
var done = false
while (!done) {
val qel = tmpMbox.extractFirst((msg: Any, replyTo: OutputChannel[Any]) => {
senders = List(replyTo)
- handlesMessage(msg)
+ handler.isDefinedAt(msg)
})
if (tmpMbox ne mailbox)
tmpMbox.foreach((m, s) => mailbox.append(m, s))
@@ -85,13 +83,13 @@ trait ReplyReactor extends Reactor with ReplyableReactor {
drainSendBuffer(tmpMbox)
// keep going
} else {
- waitingFor = handlesMessage
+ waitingFor = handler
// see Reactor.searchMailbox
throw Actor.suspendException
}
}
} else {
- resumeReceiver((qel.msg, qel.session), resumeOnSameThread)
+ resumeReceiver((qel.msg, qel.session), handler, resumeOnSameThread)
done = true
}
}
diff --git a/src/actors/scala/actors/Replyable.scala b/src/actors/scala/actors/Replyable.scala
index b1ccb3205e..2c7e55e06a 100644
--- a/src/actors/scala/actors/Replyable.scala
+++ b/src/actors/scala/actors/Replyable.scala
@@ -59,7 +59,7 @@ trait Replyable[-T, +R] {
* @param f the function to be applied to the response
* @return the future
*/
- def !![P](msg: T, f: R =>? P): () => P =
+ def !![P](msg: T, f: PartialFunction[R, P]): () => P =
() => f(this !? msg)
}
diff --git a/src/actors/scala/actors/ReplyableActor.scala b/src/actors/scala/actors/ReplyableActor.scala
index b562dbf855..2122dd854b 100644
--- a/src/actors/scala/actors/ReplyableActor.scala
+++ b/src/actors/scala/actors/ReplyableActor.scala
@@ -62,7 +62,7 @@ private[actors] trait ReplyableActor extends ReplyableReactor {
* <code>f</code>. This also allows to recover a more
* precise type for the reply value.
*/
- override def !![A](msg: Any, f: Any =>? A): Future[A] = {
+ override def !![A](msg: Any, f: PartialFunction[Any, A]): Future[A] = {
val ftch = new Channel[A](Actor.self(thiz.scheduler))
thiz.send(msg, new OutputChannel[Any] {
def !(msg: Any) =
@@ -108,7 +108,7 @@ private[actors] trait ReplyableActor extends ReplyableReactor {
Futures.fromInputChannel(someChan)
}
// should never be invoked; return dummy value
- override def !![A](msg: Any, f: Any =>? A): Future[A] = {
+ override def !![A](msg: Any, f: PartialFunction[Any, A]): Future[A] = {
val someChan = new Channel[A](Actor.self(thiz.scheduler))
Futures.fromInputChannel(someChan)
}
@@ -117,7 +117,7 @@ private[actors] trait ReplyableActor extends ReplyableReactor {
thiz.send(msg, linkedChannel)
new Future[Any](ftch) {
var exitReason: Option[Any] = None
- val handleReply: Any =>? Unit = {
+ val handleReply: PartialFunction[Any, Unit] = {
case Exit(from, reason) =>
exitReason = Some(reason)
case any =>
@@ -145,7 +145,7 @@ private[actors] trait ReplyableActor extends ReplyableReactor {
def isSet = (fvalue match {
case None =>
- val handleTimeout: Any =>? Boolean = {
+ val handleTimeout: PartialFunction[Any, Boolean] = {
case TIMEOUT =>
false
}
diff --git a/src/actors/scala/actors/ReplyableReactor.scala b/src/actors/scala/actors/ReplyableReactor.scala
index f5a2752f54..ecca50e26d 100644
--- a/src/actors/scala/actors/ReplyableReactor.scala
+++ b/src/actors/scala/actors/ReplyableReactor.scala
@@ -70,7 +70,7 @@ private[actors] trait ReplyableReactor extends Replyable[Any, Any] {
* <code>f</code>. This also allows to recover a more
* precise type for the reply value.
*/
- override def !![A](msg: Any, f: Any =>? A): Future[A] = {
+ override def !![A](msg: Any, f: PartialFunction[Any, A]): Future[A] = {
val myself = Actor.rawSelf(this.scheduler)
val ftch = new ReactChannel[A](myself)
val res = new scala.concurrent.SyncVar[A]
diff --git a/src/actors/scala/actors/remote/Proxy.scala b/src/actors/scala/actors/remote/Proxy.scala
index c1744a2dfc..f9a6cd8fed 100644
--- a/src/actors/scala/actors/remote/Proxy.scala
+++ b/src/actors/scala/actors/remote/Proxy.scala
@@ -69,7 +69,7 @@ private[remote] class Proxy(node: Node, name: Symbol, @transient var kernel: Net
override def !!(msg: Any): Future[Any] =
del !! msg
- override def !![A](msg: Any, f: Any =>? A): Future[A] =
+ override def !![A](msg: Any, f: PartialFunction[Any, A]): Future[A] =
del !! (msg, f)
def linkTo(to: AbstractActor): Unit =
diff --git a/src/compiler/scala/tools/ant/sabbus/Settings.scala b/src/compiler/scala/tools/ant/sabbus/Settings.scala
index 134b3a4bb9..2719196095 100644
--- a/src/compiler/scala/tools/ant/sabbus/Settings.scala
+++ b/src/compiler/scala/tools/ant/sabbus/Settings.scala
@@ -14,7 +14,6 @@ import java.io.File
import org.apache.tools.ant.types.{Path, Reference}
-@cloneable
class Settings {
private var gBf: Option[String] = None
diff --git a/src/compiler/scala/tools/nsc/Interpreter.scala b/src/compiler/scala/tools/nsc/Interpreter.scala
index 55e3b354e5..1655c0130d 100644
--- a/src/compiler/scala/tools/nsc/Interpreter.scala
+++ b/src/compiler/scala/tools/nsc/Interpreter.scala
@@ -103,6 +103,18 @@ class Interpreter(val settings: Settings, out: PrintWriter)
}
}
+ /** whether to bind the lastException variable */
+ private var bindLastException = true
+
+ /** Temporarily stop binding lastException */
+ def withoutBindingLastException[T](operation: => T): T = {
+ val wasBinding = bindLastException
+ ultimately(bindLastException = wasBinding) {
+ bindLastException = false
+ operation
+ }
+ }
+
/** interpreter settings */
lazy val isettings = new InterpreterSettings(this)
@@ -477,13 +489,13 @@ class Interpreter(val settings: Settings, out: PrintWriter)
val binderObject = loadByName(binderName)
val setterMethod = methodByName(binderObject, "set")
- // this roundabout approach is to ensure the value is boxed
- var argsHolder: Array[Any] = null
- argsHolder = List(value).toArray
- setterMethod.invoke(null, argsHolder.asInstanceOf[Array[AnyRef]]: _*)
+ setterMethod.invoke(null, value.asInstanceOf[AnyRef])
interpret("val %s = %s.value".format(name, binderName))
}
+ def quietBind(name: String, boundType: String, value: Any): IR.Result =
+ beQuietDuring { bind(name, boundType, value) }
+
/** Reset this interpreter, forgetting all user-specified requests. */
def reset() {
virtualDirectory.clear
@@ -505,12 +517,14 @@ class Interpreter(val settings: Settings, out: PrintWriter)
/** A traverser that finds all mentioned identifiers, i.e. things
* that need to be imported. It might return extra names.
*/
- private class ImportVarsTraverser(definedVars: List[Name]) extends Traverser {
+ private class ImportVarsTraverser extends Traverser {
val importVars = new HashSet[Name]()
override def traverse(ast: Tree) = ast match {
- case Ident(name) => importVars += name
- case _ => super.traverse(ast)
+ // XXX this is obviously inadequate but it's going to require some effort
+ // to get right.
+ case Ident(name) if !(name.toString startsWith "x$") => importVars += name
+ case _ => super.traverse(ast)
}
}
@@ -518,9 +532,9 @@ class Interpreter(val settings: Settings, out: PrintWriter)
* in a single interpreter request.
*/
private sealed abstract class MemberHandler(val member: Tree) {
- val usedNames: List[Name] = {
- val ivt = new ImportVarsTraverser(boundNames)
- ivt.traverseTrees(List(member))
+ lazy val usedNames: List[Name] = {
+ val ivt = new ImportVarsTraverser()
+ ivt traverse member
ivt.importVars.toList
}
def boundNames: List[Name] = Nil
@@ -535,6 +549,7 @@ class Interpreter(val settings: Settings, out: PrintWriter)
def extraCodeToEvaluate(req: Request, code: PrintWriter) { }
def resultExtractionCode(req: Request, code: PrintWriter) { }
+ override def toString = "%s(usedNames = %s)".format(this.getClass, usedNames)
}
private class GenericHandler(member: Tree) extends MemberHandler(member)
@@ -788,9 +803,13 @@ class Interpreter(val settings: Settings, out: PrintWriter)
val wrapperExceptions: List[Class[_ <: Throwable]] =
List(classOf[InvocationTargetException], classOf[ExceptionInInitializerError])
- def onErr: Catcher[(String, Boolean)] = { case t: Throwable =>
- beQuietDuring { bind("lastException", "java.lang.Throwable", t) }
- (stringFrom(t.printStackTrace(_)), false)
+ /** We turn off the binding to accomodate ticket #2817 */
+ def onErr: Catcher[(String, Boolean)] = {
+ case t: Throwable if bindLastException =>
+ withoutBindingLastException {
+ quietBind("lastException", "java.lang.Throwable", t)
+ (stringFrom(t.printStackTrace(_)), false)
+ }
}
catching(onErr) {
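
withoutBindingLastException above temporarily clears bindLastException and restores it afterwards, so that binding lastException for #2817 cannot recurse. The pattern boils down to the try/finally sketch below (illustration only; the patch expresses the restore with ultimately from scala.util.control.Exception):

    // Sketch of the flag save/clear/restore pattern used by
    // withoutBindingLastException; `ultimately` amounts to this try/finally.
    class BindingGuard {
      private var bindLastException = true

      def withoutBindingLastException[T](operation: => T): T = {
        val saved = bindLastException
        bindLastException = false
        try operation
        finally bindLastException = saved
      }
    }
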
diff --git a/src/compiler/scala/tools/nsc/InterpreterLoop.scala b/src/compiler/scala/tools/nsc/InterpreterLoop.scala
index 76f09f07f7..2b926d8e80 100644
--- a/src/compiler/scala/tools/nsc/InterpreterLoop.scala
+++ b/src/compiler/scala/tools/nsc/InterpreterLoop.scala
@@ -10,6 +10,7 @@ import java.io.{ BufferedReader, File, FileReader, PrintWriter }
import java.io.IOException
import scala.tools.nsc.{ InterpreterResults => IR }
+import scala.collection.JavaConversions.asBuffer
import interpreter._
import io.{ Process }
@@ -22,30 +23,34 @@ object InterpreterControl {
// a single interpreter command
sealed abstract class Command extends Function1[List[String], Result] {
- val name: String
- val help: String
+ def name: String
+ def help: String
def error(msg: String) = {
println(":" + name + " " + msg + ".")
Result(true, None)
}
- def getHelp(): String = ":" + name + " " + help + "."
+ def usage(): String
}
case class NoArgs(name: String, help: String, f: () => Result) extends Command {
+ def usage(): String = ":" + name
def apply(args: List[String]) = if (args.isEmpty) f() else error("accepts no arguments")
}
case class LineArg(name: String, help: String, f: (String) => Result) extends Command {
+ def usage(): String = ":" + name + " <line>"
def apply(args: List[String]) = f(args mkString " ")
}
case class OneArg(name: String, help: String, f: (String) => Result) extends Command {
+ def usage(): String = ":" + name + " <arg>"
def apply(args: List[String]) =
if (args.size == 1) f(args.head)
else error("requires exactly one argument")
}
case class VarArgs(name: String, help: String, f: (List[String]) => Result) extends Command {
+ def usage(): String = ":" + name + " [arg]"
def apply(args: List[String]) = f(args)
}
@@ -54,8 +59,6 @@ object InterpreterControl {
}
import InterpreterControl._
-// import scala.concurrent.ops.defaultRunner
-
/** The
* <a href="http://scala-lang.org/" target="_top">Scala</a>
* interactive shell. It provides a read-eval-print loop around
@@ -76,6 +79,12 @@ class InterpreterLoop(in0: Option[BufferedReader], out: PrintWriter) {
/** The input stream from which commands come, set by main() */
var in: InteractiveReader = _
+ def history = in match {
+ case x: JLineReader => Some(x.history)
+ case _ => None
+ }
+ def historyList: Seq[String] =
+ history map (x => asBuffer(x.getHistoryList): Seq[String]) getOrElse Nil
/** The context class loader at the time this object was created */
protected val originalClassLoader = Thread.currentThread.getContextClassLoader
@@ -126,8 +135,11 @@ class InterpreterLoop(in0: Option[BufferedReader], out: PrintWriter) {
/** print a friendly help message */
def printHelp() = {
- out println "All commands can be abbreviated - for example :h or :he instead of :help.\n"
- commands foreach { c => out println c.getHelp }
+ out println "All commands can be abbreviated - for example :he instead of :help.\n"
+ val cmds = commands map (x => (x.usage, x.help))
+ val width: Int = cmds map { case (x, _) => x.length } max
+ val formatStr = "%-" + width + "s %s"
+ cmds foreach { case (usage, help) => out println formatStr.format(usage, help) }
}
/** Print a welcome message */
@@ -143,6 +155,36 @@ class InterpreterLoop(in0: Option[BufferedReader], out: PrintWriter) {
out.flush
}
+ /** Show the history */
+ def printHistory(xs: List[String]) {
+ val defaultLines = 20
+
+ if (history.isEmpty)
+ return println("No history available.")
+
+ val current = history.get.getCurrentIndex
+ val count = try xs.head.toInt catch { case _: Exception => defaultLines }
+ val lines = historyList takeRight count
+ val offset = current - lines.size + 1
+
+ for ((line, index) <- lines.zipWithIndex)
+ println("%d %s".format(index + offset, line))
+ }
+
+ /** Search the history */
+ def searchHistory(_cmdline: String) {
+ val cmdline = _cmdline.toLowerCase
+
+ if (history.isEmpty)
+ return println("No history available.")
+
+ val current = history.get.getCurrentIndex
+ val offset = current - historyList.size + 1
+
+ for ((line, index) <- historyList.zipWithIndex ; if line.toLowerCase contains cmdline)
+ println("%d %s".format(index + offset, line))
+ }
+
/** Prompt to print when awaiting input */
val prompt = Properties.shellPromptString
@@ -160,13 +202,15 @@ class InterpreterLoop(in0: Option[BufferedReader], out: PrintWriter) {
val standardCommands: List[Command] = {
import CommandImplicits._
List(
- NoArgs("help", "prints this help message", printHelp),
+ NoArgs("help", "print this help message", printHelp),
+ VarArgs("history", "show the history (optional arg: lines to show)", printHistory),
+ LineArg("h?", "search the history", searchHistory),
OneArg("jar", "add a jar to the classpath", addJar),
- OneArg("load", "followed by a filename loads a Scala file", load),
+ OneArg("load", "load and interpret a Scala file", load),
NoArgs("power", "enable power user mode", power),
- NoArgs("quit", "exits the interpreter", () => Result(false, None)),
- NoArgs("replay", "resets execution and replays all previous commands", replay),
- LineArg("sh", "forks a shell and runs a command", runShellCmd),
+ NoArgs("quit", "exit the interpreter", () => Result(false, None)),
+ NoArgs("replay", "reset execution and replay all previous commands", replay),
+ LineArg("sh", "fork a shell and run a command", runShellCmd),
NoArgs("silent", "disable/enable automatic printing of results", verbosity)
)
}
@@ -296,9 +340,10 @@ class InterpreterLoop(in0: Option[BufferedReader], out: PrintWriter) {
replay()
}
- def power() = {
+ def power() {
powerUserOn = true
interpreter.powerUser()
+ interpreter.quietBind("history", "scala.collection.immutable.List[String]", historyList.toList)
}
def verbosity() = {
@@ -381,7 +426,7 @@ class InterpreterLoop(in0: Option[BufferedReader], out: PrintWriter) {
// the interpeter is passed as an argument to expose tab completion info
if (settings.Xnojline.value || emacsShell) new SimpleReader
else if (settings.noCompletion.value) InteractiveReader.createDefault()
- else InteractiveReader.createDefault(interpreter)
+ else InteractiveReader.createDefault(interpreter, this)
}
loadFiles(settings)
@@ -399,7 +444,7 @@ class InterpreterLoop(in0: Option[BufferedReader], out: PrintWriter) {
// injects one value into the repl; returns pair of name and class
def injectOne(name: String, obj: Any): Tuple2[String, String] = {
val className = obj.asInstanceOf[AnyRef].getClass.getName
- interpreter.bind(name, className, obj)
+ interpreter.quietBind(name, className, obj)
(name, className)
}
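
The reworked printHelp pads each command's usage string to the width of the longest one before printing usage and help side by side. A small standalone sketch of that formatting (the command texts here are only examples):

    // Sketch of the column-aligned help output built by printHelp above:
    // compute the widest usage string, then pad every row to that width.
    object HelpFormatting {
      def format(cmds: Seq[(String, String)]): Seq[String] = {
        val width     = cmds.map(_._1.length).max
        val formatStr = "%-" + width + "s %s"
        cmds map { case (usage, help) => formatStr.format(usage, help) }
      }

      def main(args: Array[String]): Unit =
        format(Seq(
          ":help"      -> "print this help message",
          ":history"   -> "show the history (optional arg: lines to show)",
          ":h? <line>" -> "search the history"
        )) foreach println
    }
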
diff --git a/src/compiler/scala/tools/nsc/Settings.scala b/src/compiler/scala/tools/nsc/Settings.scala
index b039a9e90d..3bd8c98664 100644
--- a/src/compiler/scala/tools/nsc/Settings.scala
+++ b/src/compiler/scala/tools/nsc/Settings.scala
@@ -399,7 +399,7 @@ object Settings {
case None =>
(outputs filter (isBelow _).tuple) match {
case Nil => Nil
- case matches => matches.map(_._1.lookupPath(srcPath, false))
+ case matches => matches.map(_._1.lookupPathUnchecked(srcPath, false))
}
}
}
diff --git a/src/compiler/scala/tools/nsc/ast/DocComments.scala b/src/compiler/scala/tools/nsc/ast/DocComments.scala
index bbd515d40c..a6792b3ba7 100755
--- a/src/compiler/scala/tools/nsc/ast/DocComments.scala
+++ b/src/compiler/scala/tools/nsc/ast/DocComments.scala
@@ -68,7 +68,7 @@ trait DocComments { self: SymbolTable =>
/** The list of use cases of doc comment of symbol `sym` seen as a member of class
* `site`. Each use case consists of a synthetic symbol (which is entered nowhere else),
- * and an expanded doc comment string.
+ * of an expanded doc comment string, and of its position.
*
* @param sym The symbol for which use cases are returned
* @param site The class for which doc comments are generated
@@ -76,16 +76,17 @@ trait DocComments { self: SymbolTable =>
* of the same string are done, which is
* interpreted as a recursive variable definition.
*/
- def useCases(sym: Symbol, site: Symbol): List[(Symbol, String)] = {
+ def useCases(sym: Symbol, site: Symbol): List[(Symbol, String, Position)] = {
def getUseCases(dc: DocComment) = {
for (uc <- dc.useCases; defn <- uc.expandedDefs(site)) yield
(defn,
- expandVariables(merge(cookedDocComment(sym), uc.comment.raw, defn, copyFirstPara = true), sym, site))
+ expandVariables(merge(cookedDocComment(sym), uc.comment.raw, defn, copyFirstPara = true), sym, site),
+ uc.pos)
}
getDocComment(sym) map getUseCases getOrElse List()
}
- def useCases(sym: Symbol): List[(Symbol, String)] = useCases(sym, sym)
+ def useCases(sym: Symbol): List[(Symbol, String, Position)] = useCases(sym, sym)
/** Returns the javadoc format of doc comment string `s`, including wiki expansion
*/
@@ -357,6 +358,7 @@ trait DocComments { self: SymbolTable =>
}
}
val parts = getParts(0)
+ assert(parts.length > 0, "parts is empty '" + str + "' in site " + site)
val partnames = (parts.init map newTermName) ::: List(newTypeName(parts.last))
val (start, rest) =
if (parts.head == "this")
@@ -375,7 +377,7 @@ trait DocComments { self: SymbolTable =>
for (alias <- aliases) yield
lookupVariable(alias.name.toString.substring(1), site) match {
case Some(repl) =>
- val tpe = getType(repl)
+ val tpe = getType(repl.trim)
if (tpe != NoType) tpe
else {
val alias1 = alias.cloneSymbol(definitions.RootClass)
@@ -406,7 +408,7 @@ trait DocComments { self: SymbolTable =>
}
for (defn <- defined) yield {
- defn.cloneSymbol(site).setInfo(
+ defn.cloneSymbol(site).setFlag(Flags.SYNTHETIC).setInfo(
substAliases(defn.info).asSeenFrom(site.thisType, defn.owner))
}
}
diff --git a/src/compiler/scala/tools/nsc/ast/parser/MarkupParsers.scala b/src/compiler/scala/tools/nsc/ast/parser/MarkupParsers.scala
index bd46d2219d..1f17f148aa 100644
--- a/src/compiler/scala/tools/nsc/ast/parser/MarkupParsers.scala
+++ b/src/compiler/scala/tools/nsc/ast/parser/MarkupParsers.scala
@@ -36,6 +36,9 @@ trait MarkupParsers
{
self: Parsers =>
+ type PositionType = Position
+ type InputType = CharArrayReader
+
case object MissingEndTagException extends RuntimeException with ControlException {
override def getMessage = "start tag was here: "
}
@@ -62,6 +65,8 @@ trait MarkupParsers
else reportSyntaxError(msg)
var input : CharArrayReader = _
+ def lookahead(): BufferedIterator[Char] =
+ (input.buf drop input.charOffset).iterator.buffered
import parser.{ symbXMLBuilder => handle, o2p, r2p }
@@ -83,7 +88,6 @@ trait MarkupParsers
private var debugLastStartElement = new mutable.Stack[(Int, String)]
private def debugLastPos = debugLastStartElement.top._1
private def debugLastElem = debugLastStartElement.top._2
- private def unreachable = Predef.error("Cannot be reached.")
private def errorBraces() = {
reportSyntaxError("in XML content, please use '}}' to express '}'")
@@ -190,55 +194,13 @@ trait MarkupParsers
xToken('>')
}
- /** Create a non-destructive lookahead reader and see if the head
- * of the input would match the given String. If yes, return true
- * and drop the entire String from input; if no, return false
- * and leave input unchanged.
- */
- private def peek(lookingFor: String): Boolean = {
- val la = input.lookaheadReader
- for (c <- lookingFor) {
- la.nextChar()
- if (la.ch != c)
- return false
- }
- // drop the chars from the real reader (all lookahead + orig)
- (0 to lookingFor.length) foreach (_ => nextch)
- true
- }
-
- /** Take characters from input stream until given String "until"
- * is seen. Once seen, the accumulated characters are passed
- * along with the current Position to the supplied handler function.
- */
- private def xTakeUntil[T](
- handler: (Position, String) => T,
- positioner: () => Position,
- until: String): T =
- {
- val sb = new StringBuilder
- val head = until charAt 0
- val rest = until drop 1
-
- while (true) {
- if (ch == head && peek(rest))
- return handler(positioner(), sb.toString)
- else if (ch == SU)
- throw TruncatedXML
-
- sb append ch
- nextch
- }
- unreachable
- }
-
/** '<! CharData ::= [CDATA[ ( {char} - {char}"]]>"{char} ) ']]>'
*
* see [15]
*/
def xCharData: Tree = {
val start = curOffset
- "[CDATA[" foreach xToken
+ xToken("[CDATA[")
val mid = curOffset
xTakeUntil(handle.charData, () => r2p(start, mid, curOffset), "]]>")
}
@@ -284,7 +246,7 @@ trait MarkupParsers
*/
def xComment: Tree = {
val start = curOffset - 2 // Rewinding to include "<!"
- "--" foreach xToken
+ xToken("--")
xTakeUntil(handle.comment, () => r2p(start, start, curOffset), "-->")
}
@@ -374,7 +336,7 @@ trait MarkupParsers
val start = curOffset
val (qname, attrMap) = xTag
if (ch == '/') { // empty element
- "/>" foreach xToken
+ xToken("/>")
handle.element(r2p(start, start, curOffset), qname, attrMap, new ListBuffer[Tree])
}
else { // handle content
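
The hunk above drops the local destructive `peek`/`xTakeUntil` helpers and adds a non-destructive `lookahead()` built from a buffered iterator over the remaining input. A minimal standalone sketch of that pattern, with a plain `Array[Char]` and offset standing in for `CharArrayReader`'s buffer and `charOffset` (all names below are illustrative, not part of the patch):

    import scala.collection.BufferedIterator

    object LookaheadSketch {
      // `buf` and `offset` stand in for the reader's buffer and current offset.
      def lookahead(buf: Array[Char], offset: Int): BufferedIterator[Char] =
        (buf drop offset).iterator.buffered

      // Check whether the upcoming characters match `s` without consuming input.
      def startsWith(buf: Array[Char], offset: Int, s: String): Boolean = {
        val la = lookahead(buf, offset)
        s forall (c => la.hasNext && la.next() == c)
      }

      def main(args: Array[String]): Unit = {
        val buf = "]]>rest".toCharArray
        println(startsWith(buf, 0, "]]>")) // true; `offset` is left untouched
      }
    }
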
diff --git a/src/compiler/scala/tools/nsc/ast/parser/Parsers.scala b/src/compiler/scala/tools/nsc/ast/parser/Parsers.scala
index a89f4f01c3..b86356ba94 100644
--- a/src/compiler/scala/tools/nsc/ast/parser/Parsers.scala
+++ b/src/compiler/scala/tools/nsc/ast/parser/Parsers.scala
@@ -421,14 +421,19 @@ self =>
def joinComment(trees: => List[Tree]): List[Tree] = {
val doc = in.flushDoc
if ((doc ne null) && doc.raw.length > 0) {
- val ts = trees
- val main = ts.find(_.pos.isOpaqueRange)
- ts map {
+ val joined = trees map {
t =>
val dd = DocDef(doc, t)
- val pos = doc.pos.withEnd(t.pos.endOrPoint)
- dd setPos (if (t eq main) pos else pos.makeTransparent)
+ val defnPos = t.pos
+ val pos = doc.pos.withEnd(defnPos.endOrPoint)
+ dd setPos (if (defnPos.isOpaqueRange) pos else pos.makeTransparent)
}
+ joined.find(_.pos.isOpaqueRange) foreach {
+ main =>
+ val mains = List(main)
+ joined foreach { t => if (t ne main) ensureNonOverlapping(t, mains) }
+ }
+ joined
}
else trees
}
@@ -2460,14 +2465,15 @@ self =>
val stats = new ListBuffer[Tree]
while (in.token != RBRACE && in.token != EOF) {
if (in.token == PACKAGE) {
+ in.flushDoc
val start = in.skipToken()
stats += {
if (in.token == OBJECT) makePackageObject(start, objectDef(in.offset, NoMods))
else packaging(start)
}
} else if (in.token == IMPORT) {
+ in.flushDoc
stats ++= importClause()
- // XXX: IDE hook this all.
} else if (in.token == CLASS ||
in.token == CASECLASS ||
in.token == TRAIT ||
@@ -2498,6 +2504,7 @@ self =>
var self: ValDef = emptyValDef
val stats = new ListBuffer[Tree]
if (isExprIntro) {
+ in.flushDoc
val first = expr(InTemplate) // @S: first statement is potentially converted so cannot be stubbed.
if (in.token == ARROW) {
first match {
@@ -2518,8 +2525,10 @@ self =>
}
while (in.token != RBRACE && in.token != EOF) {
if (in.token == IMPORT) {
+ in.flushDoc
stats ++= importClause()
} else if (isExprIntro) {
+ in.flushDoc
stats += statement(InTemplate)
} else if (isDefIntro || isModifier || in.token == LBRACKET /*todo: remove */ || in.token == AT) {
stats ++= joinComment(nonLocalDefOrDcl)
@@ -2618,6 +2627,7 @@ self =>
while (in.token == SEMI) in.nextToken()
val start = in.offset
if (in.token == PACKAGE) {
+ in.flushDoc
in.nextToken()
if (in.token == OBJECT) {
ts += makePackageObject(start, objectDef(in.offset, NoMods))
@@ -2645,10 +2655,14 @@ self =>
}
ts.toList
}
- val start = caseAwareTokenOffset max 0
topstats() match {
case List(stat @ PackageDef(_, _)) => stat
- case stats => makePackaging(start, atPos(o2p(start)) { Ident(nme.EMPTY_PACKAGE_NAME) }, stats)
+ case stats =>
+ val start = stats match {
+ case Nil => 0
+ case _ => wrappingPos(stats).startOrPoint
+ }
+ makePackaging(start, atPos(start, start, start) { Ident(nme.EMPTY_PACKAGE_NAME) }, stats)
}
}
}
diff --git a/src/compiler/scala/tools/nsc/ast/parser/Scanners.scala b/src/compiler/scala/tools/nsc/ast/parser/Scanners.scala
index be90a835f5..6cb7c8b99f 100644
--- a/src/compiler/scala/tools/nsc/ast/parser/Scanners.scala
+++ b/src/compiler/scala/tools/nsc/ast/parser/Scanners.scala
@@ -166,6 +166,11 @@ trait Scanners {
sepRegions = sepRegions.tail
case _ =>
}
+ (lastToken: @switch) match {
+ case RBRACE | RBRACKET | RPAREN =>
+ docBuffer = null
+ case _ =>
+ }
// Read a token or copy it from `next` tokenData
if (next.token == EMPTY) {
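
The added `(lastToken: @switch)` match discards any buffered doc comment once a closing brace, bracket, or parenthesis has been scanned, presumably so a stale comment does not attach to a later definition. A small sketch of the `@switch`-checked constant match, with hypothetical token values standing in for the scanner's:

    import scala.annotation.switch

    object TokenSwitchSketch {
      // Hypothetical token constants; the real values live in the scanner's token table.
      final val RPAREN   = 1
      final val RBRACKET = 2
      final val RBRACE   = 3

      // @switch asks the compiler to emit a JVM switch here and to warn if it cannot.
      def closesRegion(lastToken: Int): Boolean = (lastToken: @switch) match {
        case RBRACE | RBRACKET | RPAREN => true
        case _                          => false
      }
    }
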
diff --git a/src/compiler/scala/tools/nsc/backend/icode/GenICode.scala b/src/compiler/scala/tools/nsc/backend/icode/GenICode.scala
index cd690097e8..2f66f672d8 100644
--- a/src/compiler/scala/tools/nsc/backend/icode/GenICode.scala
+++ b/src/compiler/scala/tools/nsc/backend/icode/GenICode.scala
@@ -362,7 +362,7 @@ abstract class GenICode extends SubComponent {
thenCtx = genLoad(thenp, thenCtx, resKind)
elseCtx = genLoad(elsep, elseCtx, resKind)
- assert(!settings.debug.value || expectedType == UNIT,
+ assert(!settings.debug.value || !(hasUnitBranch && expectedType != UNIT),
"I produce UNIT in a context where " + expectedType + " is expected!")
thenCtx.bb.emitOnly(JUMP(contCtx.bb))
@@ -1410,7 +1410,14 @@ abstract class GenICode extends SubComponent {
assert(ctx.clazz.symbol eq cls,
"Classes are not the same: " + ctx.clazz.symbol + ", " + cls)
- for (f <- cls.info.decls ; if !f.isMethod && f.isTerm)
+ /** Non-method term members are fields, except for module members. Module
+ * members can only happen on .NET (no flatten) for inner traits. There,
+ * a module symbol is generated (transformInfo in mixin) which is used
+ * as owner for the members of the implementation class (so that the
+ * backend emits them as static).
+ * No code is needed for this module symbol.
+ */
+ for (f <- cls.info.decls ; if !f.isMethod && f.isTerm && !f.isModule)
ctx.clazz addField new IField(f)
}
diff --git a/src/compiler/scala/tools/nsc/backend/icode/analysis/CopyPropagation.scala b/src/compiler/scala/tools/nsc/backend/icode/analysis/CopyPropagation.scala
index 4f0da17bb3..8a65875fbf 100644
--- a/src/compiler/scala/tools/nsc/backend/icode/analysis/CopyPropagation.scala
+++ b/src/compiler/scala/tools/nsc/backend/icode/analysis/CopyPropagation.scala
@@ -146,10 +146,12 @@ abstract class CopyPropagation {
target match {
case Deref(LocalVar(l)) =>
val alias = getAlias(l)
+ val derefAlias = Deref(LocalVar(alias))
getBinding(alias) match {
- case Record(_, _) => Some(Deref(LocalVar(alias)))
+ case Record(_, _) => Some(derefAlias)
case Deref(Field(r1, f1)) =>
- getFieldNonRecordValue(r1, f1) orElse Some(Deref(LocalVar(alias)))
+ getFieldNonRecordValue(r1, f1) orElse Some(derefAlias)
+ case Boxed(_) => Some(derefAlias)
case v => Some(v)
}
case Deref(Field(r1, f1)) =>
diff --git a/src/compiler/scala/tools/nsc/backend/opt/Inliners.scala b/src/compiler/scala/tools/nsc/backend/opt/Inliners.scala
index 4c9a996cec..016e70a968 100644
--- a/src/compiler/scala/tools/nsc/backend/opt/Inliners.scala
+++ b/src/compiler/scala/tools/nsc/backend/opt/Inliners.scala
@@ -315,7 +315,7 @@ abstract class Inliners extends SubComponent {
i match {
case CALL_METHOD(msym, Dynamic) =>
def warnNoInline(reason: String) = {
- if (msym.hasAnnotation(ScalaInlineAttr))
+ if (msym.hasAnnotation(ScalaInlineAttr) && !m.symbol.hasFlag(Flags.BRIDGE))
currentIClazz.cunit.warning(i.pos,
"Could not inline required method %s because %s.".format(msym.originalName.decode, reason))
}
diff --git a/src/compiler/scala/tools/nsc/dependencies/Changes.scala b/src/compiler/scala/tools/nsc/dependencies/Changes.scala
index 87e38d8909..43efd0726b 100644
--- a/src/compiler/scala/tools/nsc/dependencies/Changes.scala
+++ b/src/compiler/scala/tools/nsc/dependencies/Changes.scala
@@ -92,7 +92,7 @@ abstract class Changes {
// new dependent types: probably fix this, use substSym as done for PolyType
(sameTypes(tp1.paramTypes, tp2.paramTypes) &&
((tp1.params, tp2.params).zipped forall ((t1, t2) =>
- (sameSymbol(t1, t1) && sameFlags(t1, t2)))) &&
+ (sameSymbol(t1, t2) && sameFlags(t1, t2)))) &&
sameType(res1, res2) &&
tp1.isInstanceOf[ImplicitMethodType] == tp2.isInstanceOf[ImplicitMethodType])
@@ -101,7 +101,7 @@ abstract class Changes {
case (ExistentialType(tparams1, res1), ExistentialType(tparams2, res2)) =>
sameTypeParams(tparams1, tparams2) && sameType(res1, res2)
case (TypeBounds(lo1, hi1), TypeBounds(lo2, hi2)) =>
- sameType(lo1, lo2) && sameType(hi1, hi2)
+ sameType(lo1, lo2) && sameType(hi1, hi2)
case (BoundedWildcardType(bounds), _) =>
bounds containsType tp2
case (_, BoundedWildcardType(bounds)) =>
@@ -133,40 +133,46 @@ abstract class Changes {
}
private def sameTypeParams(tparams1: List[Symbol], tparams2: List[Symbol]) =
- sameTypes(tparams1 map (_.info), tparams2 map (_.info))
+ sameTypes(tparams1 map (_.info), tparams2 map (_.info)) &&
+ sameTypes(tparams1 map (_.tpe), tparams2 map (_.tpe))
def sameTypes(tps1: List[Type], tps2: List[Type]): Boolean =
(tps1.length == tps2.length) && ((tps1, tps2).zipped forall sameType)
- /** Return the list of changes between 'from' and 'to'.
+ /** Return the list of changes between 'from' and 'toSym.info'.
*/
- def changeSet(from: Symbol, to: Symbol): List[Change] = {
+ def changeSet(from: Type, toSym: Symbol): List[Change] = {
implicit val defaultReason = "types"
-// println("changeSet " + from + "(" + from.info + ")"
-// + " vs " + to + "(" + to.info + ")")
+ val to = toSym.info
def omitSymbols(s: Symbol): Boolean = !s.hasFlag(LOCAL | LIFTED | PRIVATE)
val cs = new mutable.ListBuffer[Change]
- if ((from.info.parents zip to.info.parents) exists { case (t1, t2) => !sameType(t1, t2) })
- cs += Changed(toEntity(from))(from.info.parents.zip(to.info.parents).toString)
- if (from.typeParams != to.typeParams)
- cs += Changed(toEntity(from))(" tparams: " + from.typeParams.zip(to.typeParams))
+ if ((from.parents zip to.parents) exists { case (t1, t2) => !sameType(t1, t2) })
+ cs += Changed(toEntity(toSym))(from.parents.zip(to.parents).toString)
+ if (!sameTypeParams(from.typeParams, to.typeParams))
+ cs += Changed(toEntity(toSym))(" tparams: " + from.typeParams.zip(to.typeParams))
// new members not yet visited
val newMembers = mutable.HashSet[Symbol]()
- newMembers ++= to.info.decls.iterator filter omitSymbols
+ newMembers ++= to.decls.iterator filter omitSymbols
- for (o <- from.info.decls.iterator filter omitSymbols) {
- val n = to.info.decl(o.name)
+ for (o <- from.decls.iterator filter omitSymbols) {
+ val n = to.decl(o.name)
newMembers -= n
if (o.isClass)
- cs ++= changeSet(o, n)
+ cs ++= changeSet(o.info, n)
else if (n == NoSymbol)
cs += Removed(toEntity(o))
else {
- val newSym = n.suchThat(ov => sameType(ov.tpe, o.tpe))
+ val newSym =
+ o match {
+ case _:TypeSymbol if o.isAliasType =>
+ n.suchThat(ov => sameType(ov.info, o.info))
+ case _ =>
+ n.suchThat(ov => sameType(ov.tpe, o.tpe))
+ }
if (newSym == NoSymbol || moreRestrictive(o.flags, newSym.flags))
cs += Changed(toEntity(o))(n + " changed from " + o.tpe + " to " + n.tpe + " flags: " + Flags.flagsToString(o.flags))
else
diff --git a/src/compiler/scala/tools/nsc/doc/DocFactory.scala b/src/compiler/scala/tools/nsc/doc/DocFactory.scala
index 99cec01949..b70d8c10ec 100644
--- a/src/compiler/scala/tools/nsc/doc/DocFactory.scala
+++ b/src/compiler/scala/tools/nsc/doc/DocFactory.scala
@@ -12,10 +12,10 @@ import reporters.Reporter
* * A simplified compiler instance (with only the front-end phases enabled) is created, and additional
* ''sourceless'' comments are registered.
* * Documentable files are compiled, thereby filling the compiler's symbol table.
- * * A documentation model is extracted from the post-compilation compiler's symbol table.
+ * * A documentation model is extracted from the post-compilation symbol table.
* * A generator is used to transform the model into the correct final format (HTML).
*
- * A processor contains a single compiler instantiated from the processor's settings. Each call to the `run` method
+ * A processor contains a single compiler instantiated from the processor's `settings`. Each call to `document`
* uses the same compiler instance with the same symbol table. In particular, this implies that the scaladoc site
* obtained from a call to `run` will contain documentation about files compiled during previous calls to the same
* processor's `run` method.
@@ -50,6 +50,7 @@ class DocFactory(val reporter: Reporter, val settings: doc.Settings) { processor
def document(files: List[String]): Unit = {
(new compiler.Run()) compile files
compiler.addSourceless
+ assert(settings.docformat.value == "html")
if (!reporter.hasErrors) {
val modelFactory = (new model.ModelFactory(compiler, settings))
val htmlFactory = (new html.HtmlFactory(reporter, settings))
diff --git a/src/compiler/scala/tools/nsc/doc/DocProvider.scala b/src/compiler/scala/tools/nsc/doc/DocProvider.scala
new file mode 100644
index 0000000000..bcf227ebb9
--- /dev/null
+++ b/src/compiler/scala/tools/nsc/doc/DocProvider.scala
@@ -0,0 +1,3 @@
+package scala.tools.nsc.doc
+
+class DocProvider
\ No newline at end of file
diff --git a/src/compiler/scala/tools/nsc/doc/SourcelessComments.scala b/src/compiler/scala/tools/nsc/doc/SourcelessComments.scala
index 9216fa6f23..0791c6fa51 100644
--- a/src/compiler/scala/tools/nsc/doc/SourcelessComments.scala
+++ b/src/compiler/scala/tools/nsc/doc/SourcelessComments.scala
@@ -6,12 +6,11 @@ package doc
import scala.collection._
/**
- * This class contains comments to all symbols which pre-exist in Scala, such as Any, Nothing, ...
- * It also contains a HashSet of the given symbols
- * The comments are to be added to a HashMap called comments, which resides in the Global.scala file
- * @author Manohar Jonnalagedda, Stephane Micheloud, Sean McDirmid, Geoffrey Washburn
- * @version 1.0
- */
+ * A class that provides comments for all symbols which pre-exist in Scala (Any, Nothing, ...)
+ * It also contains a HashSet of the given symbols
+ * The comments are to be added to a HashMap called comments, which resides in the Global.scala file
+ * @author Manohar Jonnalagedda, Stephane Micheloud, Sean McDirmid, Geoffrey Washburn
+ * @version 1.0 */
abstract class SourcelessComments {
val global: Global
diff --git a/src/compiler/scala/tools/nsc/doc/html/page/Template.scala b/src/compiler/scala/tools/nsc/doc/html/page/Template.scala
index 4ffdba4603..119823ff13 100644
--- a/src/compiler/scala/tools/nsc/doc/html/page/Template.scala
+++ b/src/compiler/scala/tools/nsc/doc/html/page/Template.scala
@@ -100,15 +100,22 @@ class Template(tpl: DocTemplateEntity) extends HtmlPage {
</li>
}
- def memberToCommentHtml(mbr: MemberEntity, isSelf: Boolean): NodeSeq = mbr match {
- case dte: DocTemplateEntity if isSelf =>
- <div id="comment" class="fullcomment">{ memberToFullCommentHtml(mbr, isSelf) }</div>
- case dte: DocTemplateEntity if mbr.comment.isDefined =>
- <p class="comment cmt">{ inlineToHtml(mbr.comment.get.short) }</p>
- case _ if mbr.comment.isDefined =>
- <p class="shortcomment cmt">{ inlineToHtml(mbr.comment.get.short) }</p>
- <div class="fullcomment">{ memberToFullCommentHtml(mbr, isSelf) }</div>
- case _ => NodeSeq.Empty
+ def memberToCommentHtml(mbr: MemberEntity, isSelf: Boolean): NodeSeq = {
+ val useCaseCommentHtml = mbr match {
+ case nte: NonTemplateMemberEntity if nte.isUseCase =>
+ inlineToHtml(comment.Text("[use case] "))
+ case _ => NodeSeq.Empty
+ }
+ mbr match {
+ case dte: DocTemplateEntity if isSelf =>
+ <div id="comment" class="fullcomment">{ memberToFullCommentHtml(mbr, isSelf) }</div>
+ case dte: DocTemplateEntity if mbr.comment.isDefined =>
+ <p class="comment cmt">{ inlineToHtml(mbr.comment.get.short) }</p>
+ case _ if mbr.comment.isDefined =>
+ <p class="shortcomment cmt">{ useCaseCommentHtml }{ inlineToHtml(mbr.comment.get.short) }</p>
+ <div class="fullcomment">{ useCaseCommentHtml }{ memberToFullCommentHtml(mbr, isSelf) }</div>
+ case _ => useCaseCommentHtml
+ }
}
def memberToFullCommentHtml(mbr: MemberEntity, isSelf: Boolean): NodeSeq =
@@ -178,6 +185,15 @@ class Template(tpl: DocTemplateEntity) extends HtmlPage {
case _ => NodeSeq.Empty
}
}
+ { tpl.companion match {
+ case Some(companion) =>
+ <div class="block">
+ Go to: <a href={relativeLinkTo(companion)}>companion</a>
+ </div>
+ case None =>
+ NodeSeq.Empty
+ }
+ }
</xml:group>
def kindToString(mbr: MemberEntity): String = mbr match {
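
The rewritten `memberToCommentHtml` prefixes both short and full comments with a "[use case] " marker for members synthesized from `@usecase` tags. A self-contained sketch of that marker pattern with `scala.xml`, using simplified inputs rather than the real `MemberEntity` model (names are illustrative):

    import scala.xml.{ NodeSeq, Text }

    object UseCaseMarkerSketch {
      // `isUseCase` and `short` stand in for the entity model's fields; sketch only.
      def shortCommentHtml(isUseCase: Boolean, short: NodeSeq): NodeSeq = {
        val marker: NodeSeq = if (isUseCase) Text("[use case] ") else NodeSeq.Empty
        <p class="shortcomment cmt">{ marker }{ short }</p>
      }

      def main(args: Array[String]): Unit = {
        println(shortCommentHtml(true, Text("Adds an element.")))
        // prints: <p class="shortcomment cmt">[use case] Adds an element.</p>
      }
    }
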
diff --git a/src/compiler/scala/tools/nsc/doc/model/Entity.scala b/src/compiler/scala/tools/nsc/doc/model/Entity.scala
index 3e61b7f4ee..d7ef2b866d 100644
--- a/src/compiler/scala/tools/nsc/doc/model/Entity.scala
+++ b/src/compiler/scala/tools/nsc/doc/model/Entity.scala
@@ -70,6 +70,7 @@ trait DocTemplateEntity extends TemplateEntity with MemberEntity {
def values: List[Val]
def abstractTypes: List[AbstractType]
def aliasTypes: List[AliasType]
+ def companion: Option[DocTemplateEntity]
}
/** A ''documentable'' trait. */
@@ -94,7 +95,9 @@ trait Package extends Object {
def packages: List[Package]
}
-trait NonTemplateMemberEntity extends MemberEntity
+trait NonTemplateMemberEntity extends MemberEntity {
+ def isUseCase: Boolean
+}
/** A method (`def`) of a ''documentable'' class, trait or object. */
trait Def extends NonTemplateMemberEntity {
diff --git a/src/compiler/scala/tools/nsc/doc/model/ModelFactory.scala b/src/compiler/scala/tools/nsc/doc/model/ModelFactory.scala
index 0817cec4e2..f935dd4478 100644
--- a/src/compiler/scala/tools/nsc/doc/model/ModelFactory.scala
+++ b/src/compiler/scala/tools/nsc/doc/model/ModelFactory.scala
@@ -7,7 +7,9 @@ package model
import comment._
import scala.collection._
+
import symtab.Flags
+import util.Position
/** This trait extracts all required information for documentation from compilation units */
class ModelFactory(val global: Global, val settings: doc.Settings) { extractor =>
@@ -22,21 +24,45 @@ class ModelFactory(val global: Global, val settings: doc.Settings) { extractor =
def makeModel: Package =
makePackage(RootPackage, null) getOrElse { throw new Error("no documentable class found in compilation units") }
- /** */
- protected val commentFactory = new CommentFactory(reporter)
+ object commentator {
- /** */
- protected val commentCache = mutable.Map.empty[Symbol, comment.Comment]
+ private val factory = new CommentFactory(reporter)
+
+ private val commentCache = mutable.HashMap.empty[(Symbol, TemplateImpl), Comment]
+
+ def registeredUseCase(sym: Symbol, inTpl: => TemplateImpl, docStr: String, docPos: Position): Symbol = {
+ commentCache += (sym, inTpl) -> factory.parse(docStr, docPos)
+ sym
+ }
+
+ def comment(sym: Symbol, inTpl: => DocTemplateImpl): Option[Comment] = {
+ val key = (sym, inTpl)
+ if (commentCache isDefinedAt key)
+ Some(commentCache(key))
+ else { // not reached for use-case comments
+ val rawComment = expandedDocComment(sym, inTpl.sym)
+ if (rawComment == "") None else {
+ val c = factory.parse(rawComment, docCommentPos(sym))
+ commentCache += (sym, inTpl) -> c
+ Some(c)
+ }
+ }
+ }
+
+ }
/** */
protected val templatesCache =
new mutable.LinkedHashMap[(Symbol, TemplateImpl), DocTemplateImpl]
+ def optimize(str: String): String =
+ if (str.length < 16) str.intern else str
+
/* ============== IMPLEMENTATION PROVIDING ENTITY TYPES ============== */
/** Provides a default implementation for instances of the `Entity` type. */
abstract class EntityImpl(val sym: Symbol, inTpl: => TemplateImpl) extends Entity {
- val name = sym.nameString
+ val name = optimize(sym.nameString)
def inTemplate = inTpl
def toRoot: List[EntityImpl] = this :: inTpl.toRoot
def qualifiedName = name
@@ -45,12 +71,12 @@ class ModelFactory(val global: Global, val settings: doc.Settings) { extractor =
/** Provides a default implementation for instances of the `WeakTemplateEntity` type. It must be instantiated as a
* `SymbolicEntity` to access the compiler symbol that underlies the entity. */
trait TemplateImpl extends EntityImpl with TemplateEntity {
- override def qualifiedName = if (inTemplate.isRootPackage) name else (inTemplate.qualifiedName + "." + name)
- val isPackage = sym.isPackage
- val isTrait = sym.isTrait
- val isClass = sym.isClass && !sym.isTrait
- val isObject = sym.isModule && !sym.isPackage
- val isRootPackage = false
+ override def qualifiedName = if (inTemplate.isRootPackage) name else optimize(inTemplate.qualifiedName + "." + name)
+ def isPackage = sym.isPackage
+ def isTrait = sym.isTrait
+ def isClass = sym.isClass && !sym.isTrait
+ def isObject = sym.isModule && !sym.isPackage
+ def isRootPackage = false
}
/** Provides a default implementation for instances of the `WeakTemplateEntity` type. It must be instantiated as a
@@ -63,14 +89,7 @@ class ModelFactory(val global: Global, val settings: doc.Settings) { extractor =
* `SymbolicEntity` to access the compiler symbol that underlies the entity. */
abstract class MemberImpl(sym: Symbol, inTpl: => DocTemplateImpl) extends EntityImpl(sym, inTpl) with MemberEntity {
val comment =
- if (inTpl == null) None else {
- val rawComment = expandedDocComment(sym, inTpl.sym)
- if (rawComment == "") None else {
- val c = commentFactory.parse(rawComment, docCommentPos(sym))
- commentCache += sym -> c
- Some(c)
- }
- }
+ if (inTpl == null) None else commentator.comment(sym, inTpl)
override def inTemplate = inTpl
override def toRoot: List[MemberImpl] = this :: inTpl.toRoot
def inDefinitionTemplates =
@@ -80,7 +99,7 @@ class ModelFactory(val global: Global, val settings: doc.Settings) { extractor =
inTpl :: Nil
else
makeTemplate(sym.owner) :: (sym.allOverriddenSymbols map { inhSym => makeTemplate(inhSym.owner) })
- val visibility = {
+ def visibility = {
def qual = {
val qq =
if (sym hasFlag Flags.LOCAL)
@@ -90,11 +109,11 @@ class ModelFactory(val global: Global, val settings: doc.Settings) { extractor =
else None
qq match { case Some(q) => "[" + q + "]" case None => "" }
}
- if (sym hasFlag Flags.PRIVATE) Some(Paragraph(Text("private" + qual)))
- else if (sym hasFlag Flags.PROTECTED) Some(Paragraph(Text("protected" + qual)))
+ if (sym hasFlag Flags.PRIVATE) Some(Paragraph(Text(optimize("private" + qual))))
+ else if (sym hasFlag Flags.PROTECTED) Some(Paragraph(Text(optimize("protected" + qual))))
else None
}
- val flags = {
+ def flags = {
val fgs = mutable.ListBuffer.empty[Paragraph]
if (sym hasFlag Flags.IMPLICIT) fgs += Paragraph(Text("implicit"))
if (sym hasFlag Flags.SEALED) fgs += Paragraph(Text("sealed"))
@@ -103,18 +122,18 @@ class ModelFactory(val global: Global, val settings: doc.Settings) { extractor =
if (!sym.isModule && (sym hasFlag Flags.FINAL)) fgs += Paragraph(Text("final"))
fgs.toList
}
- lazy val inheritedFrom =
+ def inheritedFrom =
if (inTemplate.sym == this.sym.owner || inTemplate.sym.isPackage) Nil else
makeTemplate(this.sym.owner) :: (sym.allOverriddenSymbols map { os => makeTemplate(os.owner) })
- val isDeprecated = sym.isDeprecated
- lazy val resultType = makeType(sym.tpe.finalResultType, inTemplate, sym)
- val isDef = false
- val isVal = false
- val isVar = false
- val isConstructor = false
- val isAliasType = false
- val isAbstractType = false
- val isTemplate = false
+ def isDeprecated = sym.isDeprecated
+ def resultType = makeType(sym.tpe.finalResultType, inTemplate, sym)
+ def isDef = false
+ def isVal = false
+ def isVar = false
+ def isConstructor = false
+ def isAliasType = false
+ def isAbstractType = false
+ def isTemplate = false
}
/** Provides a default implementation for instances of the `TemplateEntity` type. It must be instantiated as a
@@ -128,11 +147,11 @@ class ModelFactory(val global: Global, val settings: doc.Settings) { extractor =
abstract class DocTemplateImpl(sym: Symbol, inTpl: => DocTemplateImpl) extends MemberImpl(sym, inTpl) with TemplateImpl with DocTemplateEntity {
//if (inTpl != null) println("mbr " + sym + " in " + (inTpl.toRoot map (_.sym)).mkString(" > "))
templatesCache += ((sym, inTpl) -> this)
- override def definitionName = inDefinitionTemplates.head.qualifiedName + "." + name
+ override def definitionName = optimize(inDefinitionTemplates.head.qualifiedName + "." + name)
override def toRoot: List[DocTemplateImpl] = this :: inTpl.toRoot
- val inSource = if (sym.sourceFile != null) Some(sym.sourceFile, sym.pos.line) else None
- val typeParams = if (sym.isClass) sym.typeParams map (makeTypeParam(_, this)) else Nil
- val parentType =
+ def inSource = if (sym.sourceFile != null) Some(sym.sourceFile, sym.pos.line) else None
+ def typeParams = if (sym.isClass) sym.typeParams map (makeTypeParam(_, this)) else Nil
+ def parentType =
if (sym.isPackage) None else
Some(makeType(RefinedType(sym.tpe.parents filter (_ != ScalaObjectClass.tpe), EmptyScope)))
val linearization = {
@@ -150,15 +169,19 @@ class ModelFactory(val global: Global, val settings: doc.Settings) { extractor =
subClassesCache += sc
}
def subClasses = subClassesCache.toList
- def memberSyms = sym.info.nonPrivateMembers
+ protected def memberSyms = sym.info.nonPrivateMembers
val members: List[MemberEntity] = memberSyms flatMap (makeMember(_, this))
val templates = members partialMap { case c: DocTemplateEntity => c }
val methods = members partialMap { case d: Def => d }
val values = members partialMap { case v: Val => v }
val abstractTypes = members partialMap { case t: AbstractType => t }
val aliasTypes = members partialMap { case t: AliasType => t }
- override val isTemplate = true
+ override def isTemplate = true
def isDocTemplate = true
+ def companion = sym.linkedSym match {
+ case NoSymbol => None
+ case comSym => Some(makeDocTemplate(comSym, inTpl))
+ }
}
abstract class PackageImpl(sym: Symbol, inTpl: => PackageImpl) extends DocTemplateImpl(sym, inTpl) with Package {
@@ -168,8 +191,8 @@ class ModelFactory(val global: Global, val settings: doc.Settings) { extractor =
}
abstract class NonTemplateMemberImpl(sym: Symbol, inTpl: => DocTemplateImpl) extends MemberImpl(sym, inTpl) with NonTemplateMemberEntity {
- override def qualifiedName = inTemplate.qualifiedName + "#" + name
- override def definitionName = inDefinitionTemplates.head.qualifiedName + "#" + name
+ override def qualifiedName = optimize(inTemplate.qualifiedName + "#" + name)
+ override def definitionName = optimize(inDefinitionTemplates.head.qualifiedName + "#" + name)
}
abstract class ParameterImpl(sym: Symbol, inTpl: => DocTemplateImpl) extends EntityImpl(sym, inTpl) with ParameterEntity {
@@ -204,9 +227,9 @@ class ModelFactory(val global: Global, val settings: doc.Settings) { extractor =
override def inTemplate = this
override def toRoot = this :: Nil
override def qualifiedName = "_root_"
- override lazy val inheritedFrom = Nil
- override val isRootPackage = true
- override def memberSyms =
+ override def inheritedFrom = Nil
+ override def isRootPackage = true
+ override protected def memberSyms =
(bSym.info.members ++ EmptyPackage.info.members) filter { s =>
s != EmptyPackage && s != RootPackage
}
@@ -262,95 +285,109 @@ class ModelFactory(val global: Global, val settings: doc.Settings) { extractor =
new DocTemplateImpl(bSym, firstInTpl) with Object
else if (bSym.isTrait || (bSym.isAliasType && bSym.tpe.typeSymbol.isTrait))
new DocTemplateImpl(bSym, firstInTpl) with Trait {
- val valueParams =
+ def valueParams =
List(sym.constrParamAccessors map (makeValueParam(_, this)))
}
else if (bSym.isClass || (bSym.isAliasType && bSym.tpe.typeSymbol.isClass))
new DocTemplateImpl(bSym, firstInTpl) with Class {
- val valueParams =
+ def valueParams =
List(sym.constrParamAccessors map (makeValueParam(_, this)))
val constructors =
members partialMap { case d: Constructor => d }
- val primaryConstructor = (constructors find (_.isPrimary))
- val isCaseClass = sym.isClass && sym.hasFlag(Flags.CASE)
+ def primaryConstructor = (constructors find (_.isPrimary))
+ def isCaseClass = sym.isClass && sym.hasFlag(Flags.CASE)
}
else
throw new Error("'" + bSym + "' that isn't a class, trait or object cannot be built as a documentable template")
}
/** */
- def makeMember(aSym: Symbol, inTpl: => DocTemplateImpl): Option[MemberImpl] = {
+ def makeMember(aSym: Symbol, inTpl: => DocTemplateImpl): List[MemberImpl] = {
+ def makeMember0(bSym: Symbol): Option[MemberImpl] = {
+ if (bSym.isGetter && (bSym.accessed hasFlag Flags.MUTABLE))
+ Some(new NonTemplateMemberImpl(bSym, inTpl) with Val {
+ override def isVar = true
+ def isUseCase = bSym hasFlag Flags.SYNTHETIC
+ })
+ else if (bSym.isMethod && !(bSym hasFlag Flags.ACCESSOR) && !bSym.isConstructor && !(bSym hasFlag Flags.FINAL))
+ Some(new NonTemplateMemberImpl(bSym, inTpl) with Def {
+ override def isDef = true
+ def isUseCase = bSym hasFlag Flags.SYNTHETIC
+ def typeParams =
+ sym.tpe.typeParams map (makeTypeParam(_, inTpl))
+ def valueParams =
+ sym.paramss map { ps => (ps.zipWithIndex) map { case (p, i) =>
+ if (p.nameString contains "$") makeValueParam(p, inTpl, optimize("arg" + i)) else makeValueParam(p, inTpl)
+ }}
+ })
+ else if (bSym.isConstructor)
+ Some(new NonTemplateMemberImpl(bSym, inTpl) with Constructor {
+ override def isConstructor = true
+ def isUseCase = bSym hasFlag Flags.SYNTHETIC
+ def isPrimary = sym.isPrimaryConstructor
+ def valueParams =
+ sym.paramss map { ps => (ps.zipWithIndex) map { case (p, i) =>
+ if (p.nameString contains "$") makeValueParam(p, inTpl, optimize("arg" + i)) else makeValueParam(p, inTpl)
+ }}
+ })
+ else if (bSym.isGetter) // Scala field accessor or Java field
+ Some(new NonTemplateMemberImpl(bSym, inTpl) with Val {
+ override def isVal = true
+ def isUseCase = bSym hasFlag Flags.SYNTHETIC
+ })
+ else if (bSym.isAbstractType)
+ Some(new NonTemplateMemberImpl(bSym, inTpl) with AbstractType {
+ override def isAbstractType = true
+ def isUseCase = bSym hasFlag Flags.SYNTHETIC
+ def lo = sym.info.normalize match {
+ case TypeBounds(lo, hi) if lo.typeSymbol != definitions.NothingClass => Some(makeType(lo, inTpl, sym))
+ case _ => None
+ }
+ def hi = sym.info.normalize match {
+ case TypeBounds(lo, hi) if hi.typeSymbol != definitions.AnyClass => Some(makeType(hi, inTpl, sym))
+ case _ => None
+ }
+ })
+ else if (bSym.isAliasType)
+ Some(new NonTemplateMemberImpl(bSym, inTpl) with AliasType {
+ override def isAliasType = true
+ def isUseCase = bSym hasFlag Flags.SYNTHETIC
+ def alias = makeType(sym.tpe, inTpl, sym)
+ })
+ else if (bSym.isPackage)
+ inTpl match { case inPkg: PackageImpl => makePackage(bSym, inPkg) }
+ else if ((bSym.isClass || bSym.isModule) && (bSym.sourceFile != null) && bSym.isPublic && !bSym.isLocal) {
+ (inTpl.toRoot find (_.sym == bSym )) orElse Some(makeDocTemplate(bSym, inTpl))
+ }
+ else
+ None
+ }
if (!aSym.isPublic || (aSym hasFlag Flags.SYNTHETIC) || (aSym hasFlag Flags.BRIDGE) || aSym.isLocal || aSym.isModuleClass || aSym.isPackageObject || aSym.isMixinConstructor)
- None
- else if (aSym.isGetter && (aSym.accessed hasFlag Flags.MUTABLE))
- Some(new NonTemplateMemberImpl(aSym, inTpl) with Val {
- override val isVar = true
- })
- else if (aSym.isMethod && !(aSym hasFlag Flags.ACCESSOR) && !aSym.isConstructor && !(aSym hasFlag Flags.FINAL))
- Some(new NonTemplateMemberImpl(aSym, inTpl) with Def {
- override val isDef = true
- val typeParams =
- sym.tpe.typeParams map (makeTypeParam(_, inTpl))
- val valueParams =
- sym.paramss map { ps => (ps.zipWithIndex) map { case (p, i) =>
- if (p.nameString contains "$") makeValueParam(p, inTpl, "arg" + i) else makeValueParam(p, inTpl)
- }}
- })
- else if (aSym.isConstructor)
- Some(new NonTemplateMemberImpl(aSym, inTpl) with Constructor {
- override val isConstructor = true
- val isPrimary = sym.isPrimaryConstructor
- val valueParams =
- sym.paramss map { ps => (ps.zipWithIndex) map { case (p, i) =>
- if (p.nameString contains "$") makeValueParam(p, inTpl, "arg" + i) else makeValueParam(p, inTpl)
- }}
- })
- else if (aSym.isGetter) // Scala field accessor or Java field
- Some(new NonTemplateMemberImpl(aSym, inTpl) with Val {
- override val isVal = true
- })
- else if (aSym.isAbstractType)
- Some(new NonTemplateMemberImpl(aSym, inTpl) with AbstractType {
- override val isAbstractType = true
- val lo = sym.info.normalize match {
- case TypeBounds(lo, hi) if lo.typeSymbol != definitions.NothingClass => Some(makeType(lo, inTpl, sym))
- case _ => None
- }
- val hi = sym.info.normalize match {
- case TypeBounds(lo, hi) if hi.typeSymbol != definitions.AnyClass => Some(makeType(hi, inTpl, sym))
- case _ => None
- }
- })
- else if (aSym.isAliasType)
- Some(new NonTemplateMemberImpl(aSym, inTpl) with AliasType {
- override val isAliasType = true
- val alias = makeType(sym.tpe, inTpl, sym)
- })
- else if (aSym.isPackage)
- inTpl match { case inPkg: PackageImpl => makePackage(aSym, inPkg) }
- else if ((aSym.isClass || aSym.isModule) && (aSym.sourceFile != null) && aSym.isPublic && !aSym.isLocal) {
- (inTpl.toRoot find (_.sym == aSym )) orElse Some(makeDocTemplate(aSym, inTpl))
+ Nil
+ else {
+ val allSyms = useCases(aSym, inTpl.sym) map { case (bSym, bComment, bPos) =>
+ commentator.registeredUseCase(bSym, inTpl, bComment, bPos)
+ }
+ (allSyms ::: List(aSym)) flatMap (makeMember0(_))
}
- else
- None
}
/** */
def makeTypeParam(aSym: Symbol, inTpl: => DocTemplateImpl): TypeParam = {
new ParameterImpl(aSym, inTpl) with TypeParam {
- val isTypeParam = true
- val isValueParam = false
- val variance: String = {
+ def isTypeParam = true
+ def isValueParam = false
+ def variance: String = {
if (sym hasFlag Flags.COVARIANT) "+"
else if (sym hasFlag Flags.CONTRAVARIANT) "-"
else ""
}
- val lo = sym.info.normalize match {
+ def lo = sym.info.normalize match {
case TypeBounds(lo, hi) if lo.typeSymbol != definitions.NothingClass =>
Some(makeType(lo, inTpl, sym))
case _ => None
}
- val hi = sym.info.normalize match {
+ def hi = sym.info.normalize match {
case TypeBounds(lo, hi) if hi.typeSymbol != definitions.AnyClass =>
Some(makeType(hi, inTpl, sym))
case _ => None
@@ -366,12 +403,12 @@ class ModelFactory(val global: Global, val settings: doc.Settings) { extractor =
/** */
def makeValueParam(aSym: Symbol, inTpl: => DocTemplateImpl, newName: String): ValueParam = {
new ParameterImpl(aSym, inTpl) with ValueParam {
- val isTypeParam = false
- val isValueParam = true
- val resultType = {
+ override val name = newName
+ def isTypeParam = false
+ def isValueParam = true
+ def resultType = {
makeType(sym.tpe, inTpl, sym)
}
- override val name = newName
}
}
@@ -449,7 +486,7 @@ class ModelFactory(val global: Global, val settings: doc.Settings) { extractor =
}
appendType0(aType)
val refEntity = refBuffer
- val name = nameBuffer.toString
+ val name = optimize(nameBuffer.toString)
}
}
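
The new `commentator` object memoizes parsed comments under a `(Symbol, TemplateImpl)` key, so use-case comments registered up front and ordinary comments parsed on demand share a single cache. A sketch of that pair-keyed cache with `String` stand-ins for the compiler types (everything below is illustrative, not the patch's API):

    import scala.collection.mutable

    object CommentCacheSketch {
      // String keys replace the real (Symbol, TemplateImpl) pairs; sketch only.
      private val cache = mutable.HashMap.empty[(String, String), String]

      // Pre-register a comment (the use-case path).
      def register(sym: String, inTpl: String, text: String): Unit =
        cache((sym, inTpl)) = text

      // Look up a comment, parsing and caching it on a miss (the ordinary path).
      def comment(sym: String, inTpl: String)(parse: => String): Option[String] = {
        val key = (sym, inTpl)
        if (cache isDefinedAt key) Some(cache(key))
        else parse match {
          case ""  => None
          case str => cache(key) = str; Some(str)
        }
      }
    }
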
diff --git a/src/compiler/scala/tools/nsc/doc/model/comment/CommentFactory.scala b/src/compiler/scala/tools/nsc/doc/model/comment/CommentFactory.scala
index d51573364f..4504a97af5 100644
--- a/src/compiler/scala/tools/nsc/doc/model/comment/CommentFactory.scala
+++ b/src/compiler/scala/tools/nsc/doc/model/comment/CommentFactory.scala
@@ -495,7 +495,7 @@ final class CommentFactory(val reporter: Reporter) { parser =>
final def getRead(): String = {
val bld = readBuilder.toString
readBuilder.clear()
- bld
+ if (bld.length < 6) bld.intern else bld
}
final def readUntil(ch: Char): Int = {
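
Both `optimize` in ModelFactory and `getRead` here intern only short strings, deduplicating the many small identifiers scaladoc produces while skipping the intern-table cost for long ones. A standalone sketch of that length-gated interning (the cutoff mirrors the patch but is otherwise an arbitrary threshold):

    object InternSketch {
      // Intern only short strings; long ones rarely repeat often enough to pay
      // for the intern-table lookup.
      def optimize(str: String): String =
        if (str.length < 16) str.intern else str

      def main(args: Array[String]): Unit = {
        val a = optimize(new String("name"))
        val b = optimize(new String("name"))
        println(a eq b) // true: both refer to the single interned instance
      }
    }
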
diff --git a/src/compiler/scala/tools/nsc/interactive/RefinedBuildManager.scala b/src/compiler/scala/tools/nsc/interactive/RefinedBuildManager.scala
index 5414b53e0c..9ca9a740df 100644
--- a/src/compiler/scala/tools/nsc/interactive/RefinedBuildManager.scala
+++ b/src/compiler/scala/tools/nsc/interactive/RefinedBuildManager.scala
@@ -40,15 +40,15 @@ class RefinedBuildManager(val settings: Settings) extends Changes with BuildMana
protected def newCompiler(settings: Settings) = new BuilderGlobal(settings)
val compiler = newCompiler(settings)
- import compiler.{Symbol, atPhase, currentRun}
+ import compiler.{Symbol, Type, atPhase, currentRun}
- private case class Symbols(sym: Symbol, symBefErasure: Symbol)
+ private case class SymWithHistory(sym: Symbol, befErasure: Type)
/** Managed source files. */
private val sources: mutable.Set[AbstractFile] = new mutable.HashSet[AbstractFile]
- private val definitions: mutable.Map[AbstractFile, List[Symbols]] =
- new mutable.HashMap[AbstractFile, List[Symbols]] {
+ private val definitions: mutable.Map[AbstractFile, List[SymWithHistory]] =
+ new mutable.HashMap[AbstractFile, List[SymWithHistory]] {
override def default(key: AbstractFile) = Nil
}
@@ -72,7 +72,7 @@ class RefinedBuildManager(val settings: Settings) extends Changes with BuildMana
*/
private def invalidatedByRemove(files: Set[AbstractFile]): Set[AbstractFile] = {
val changes = new mutable.HashMap[Symbol, List[Change]]
- for (f <- files; Symbols(sym, _) <- definitions(f))
+ for (f <- files; SymWithHistory(sym, _) <- definitions(f))
changes += sym -> List(Removed(Class(sym.fullNameString)))
invalidated(files, changes)
}
@@ -125,13 +125,12 @@ class RefinedBuildManager(val settings: Settings) extends Changes with BuildMana
definitions(src).find(
s => (s.sym.fullNameString == sym.fullNameString) &&
isCorrespondingSym(s.sym, sym)) match {
- case Some(Symbols(oldSym, oldSymEras)) =>
- val changes = changeSet(oldSym, sym)
+ case Some(SymWithHistory(oldSym, info)) =>
+ val changes = changeSet(oldSym.info, sym)
val changesErasure =
atPhase(currentRun.erasurePhase.prev) {
- changeSet(oldSymEras, sym)
+ changeSet(info, sym)
}
-
changesOf(oldSym) = (changes ++ changesErasure).removeDuplicates
case _ =>
// a new top level definition
@@ -142,7 +141,7 @@ class RefinedBuildManager(val settings: Settings) extends Changes with BuildMana
}
}
// Create a change for the top level classes that were removed
- val removed = definitions(src) filterNot ((s:Symbols) =>
+ val removed = definitions(src) filterNot ((s:SymWithHistory) =>
syms.find(_.fullNameString == (s.sym.fullNameString)) != None)
for (s <- removed) {
changesOf(s.sym) = List(removeChangeSet(s.sym))
@@ -279,7 +278,11 @@ class RefinedBuildManager(val settings: Settings) extends Changes with BuildMana
private def updateDefinitions(files: Set[AbstractFile]) {
for (src <- files; val localDefs = compiler.dependencyAnalysis.definitions(src)) {
definitions(src) = (localDefs map (s => {
- Symbols(s.cloneSymbol, atPhase(currentRun.erasurePhase.prev) {s.cloneSymbol})
+ SymWithHistory(
+ s.cloneSymbol,
+ atPhase(currentRun.erasurePhase.prev) {
+ s.info.cloneInfo(s)
+ })
}))
}
this.references = compiler.dependencyAnalysis.references
diff --git a/src/compiler/scala/tools/nsc/interpreter/Completion.scala b/src/compiler/scala/tools/nsc/interpreter/Completion.scala
index 2ecafa974a..2b9538b3fc 100644
--- a/src/compiler/scala/tools/nsc/interpreter/Completion.scala
+++ b/src/compiler/scala/tools/nsc/interpreter/Completion.scala
@@ -31,7 +31,12 @@ import scala.util.NameTransformer.{ decode, encode }
// REPL completor - queries supplied interpreter for valid completions
// based on current contents of buffer.
-class Completion(val interpreter: Interpreter) extends Completor {
+class Completion(
+ val interpreter: Interpreter,
+ val intLoop: InterpreterLoop)
+extends Completor {
+ def this(interpreter: Interpreter) = this(interpreter, null)
+
import Completion._
import java.util.{ List => JList }
import interpreter.compilerClasspath
@@ -59,7 +64,7 @@ class Completion(val interpreter: Interpreter) extends Completor {
}
// One instance of a command line
- class Buffer(s: String) {
+ class Buffer(s: String, verbose: Boolean) {
val buffer = if (s == null) "" else s
def isEmptyBuffer = buffer == ""
@@ -133,25 +138,29 @@ class Completion(val interpreter: Interpreter) extends Completor {
}
def membersOfPredef() = membersOfId("scala.Predef")
- def javaLangToHide(s: String) =
+ def javaLangToHide(s: String) = (
(s endsWith "Exception") ||
(s endsWith "Error") ||
(s endsWith "Impl") ||
(s startsWith "CharacterData") ||
!existsAndPublic("java.lang." + s)
+ )
def scalaToHide(s: String) =
(List("Tuple", "Product", "Function") exists (x => (x + """\d+""").r findPrefixMatchOf s isDefined)) ||
(List("Exception", "Error") exists (s endsWith _))
- def defaultMembers = (List("scala", "java.lang") flatMap membersOfPath) ::: membersOfPredef
+ /** Hide all default members unless in verbose mode */
+ def defaultMembers =
+ if (verbose) (List("scala", "java.lang") flatMap membersOfPath) ::: membersOfPredef
+ else Nil
def pkgsStartingWith(s: String) = topLevelPackages() filter (_ startsWith s)
def idsStartingWith(s: String) = {
- // on a totally empty buffer, filter out res*
+ // only print res* when verbose
val unqIds =
- if (s == "") interpreter.unqualifiedIds filterNot (_ startsWith INTERPRETER_VAR_PREFIX)
- else interpreter.unqualifiedIds
+ if (verbose) interpreter.unqualifiedIds
+ else interpreter.unqualifiedIds filterNot (_ startsWith INTERPRETER_VAR_PREFIX)
(unqIds ::: defaultMembers) filter (_ startsWith s)
}
@@ -175,9 +184,21 @@ class Completion(val interpreter: Interpreter) extends Completor {
(interpreter getClassObject ("scala." + path)) orElse
(interpreter getClassObject ("java.lang." + path))
+ def lastHistoryItem =
+ for (loop <- Option(intLoop) ; h <- loop.history) yield
+ h.getHistoryList.get(h.size - 1)
+
+ // Is the buffer the same it was last time they hit tab?
+ private var lastTab: (String, String) = (null, null)
+
// jline's completion comes through here - we ask a Buffer for the candidates.
- override def complete(_buffer: String, cursor: Int, candidates: JList[String]): Int =
- new Buffer(_buffer) complete candidates
+ override def complete(_buffer: String, cursor: Int, candidates: JList[String]): Int = {
+ // println("_buffer = %s, cursor = %d".format(_buffer, cursor))
+ val verbose = (_buffer, lastHistoryItem orNull) == lastTab
+ lastTab = (_buffer, lastHistoryItem orNull)
+
+ new Buffer(_buffer, verbose) complete candidates
+ }
def completePackageMembers(path: String): List[String] =
getClassObject(path + "." + "package") map (getMembers(_, false)) getOrElse Nil
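
The `lastTab` pair makes completion verbose only when tab is pressed twice in a row with an unchanged buffer and no intervening history entry. A minimal sketch of that double-tab detection, decoupled from jline and the interpreter (class and method names are illustrative):

    // Be verbose only when tab is hit twice in a row with the same buffer.
    // The real code also folds in the last history item, so running a command
    // in between resets the state.
    class DoubleTabSketch {
      private var lastTab: String = null

      def isSecondTab(buffer: String): Boolean = {
        val verbose = buffer != null && buffer == lastTab
        lastTab = buffer
        verbose
      }
    }

    object DoubleTabSketch {
      def main(args: Array[String]): Unit = {
        val t = new DoubleTabSketch
        println(t.isSecondTab("li"))  // false: first tab
        println(t.isSecondTab("li"))  // true:  same buffer, second tab
        println(t.isSecondTab("lis")) // false: buffer changed
      }
    }
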
diff --git a/src/compiler/scala/tools/nsc/interpreter/InteractiveReader.scala b/src/compiler/scala/tools/nsc/interpreter/InteractiveReader.scala
index f2eb30cf14..500876bf69 100644
--- a/src/compiler/scala/tools/nsc/interpreter/InteractiveReader.scala
+++ b/src/compiler/scala/tools/nsc/interpreter/InteractiveReader.scala
@@ -38,9 +38,9 @@ object InteractiveReader {
/** Create an interactive reader. Uses <code>JLineReader</code> if the
* library is available, but otherwise uses a <code>SimpleReader</code>.
*/
- def createDefault(interpreter: Interpreter): InteractiveReader =
+ def createDefault(interpreter: Interpreter, intLoop: InterpreterLoop = null): InteractiveReader =
catching(exes: _*)
- . opt (new JLineReader(interpreter))
+ . opt (new JLineReader(interpreter, intLoop))
. getOrElse (new SimpleReader)
}
diff --git a/src/compiler/scala/tools/nsc/interpreter/JLineReader.scala b/src/compiler/scala/tools/nsc/interpreter/JLineReader.scala
index 59d6f0ac0a..b13b54a716 100644
--- a/src/compiler/scala/tools/nsc/interpreter/JLineReader.scala
+++ b/src/compiler/scala/tools/nsc/interpreter/JLineReader.scala
@@ -11,15 +11,17 @@ import java.io.File
import jline.{ History, ConsoleReader, ArgumentCompletor }
/** Reads from the console using JLine */
-class JLineReader(interpreter: Interpreter) extends InteractiveReader {
- def this() = this(null)
+class JLineReader(interpreter: Interpreter, intLoop: InterpreterLoop) extends InteractiveReader {
+ def this() = this(null, null)
+ def this(interpreter: Interpreter) = this(interpreter, null)
+ def history: History = consoleReader.getHistory
+
val consoleReader = {
- val history = try {
- new jline.History(new File(System.getProperty("user.home"), ".scala_history"))
- } catch {
+ val history =
+ try new History(new File(System.getProperty("user.home"), ".scala_history"))
// do not store history if error
- case _ => new jline.History()
- }
+ catch { case _: Exception => new History() }
+
val r = new jline.ConsoleReader()
r setHistory history
r setBellEnabled false
@@ -30,7 +32,7 @@ class JLineReader(interpreter: Interpreter) extends InteractiveReader {
val delimChars = "(){}[],`;'\" \t".toArray
def isDelimiterChar(s: String, pos: Int) = delimChars contains s.charAt(pos)
}
- val comp = new ArgumentCompletor(new Completion(interpreter), delims)
+ val comp = new ArgumentCompletor(new Completion(interpreter, intLoop), delims)
comp setStrict false
r addCompletor comp
// XXX make this use a setting
diff --git a/src/compiler/scala/tools/nsc/interpreter/SimpleReader.scala b/src/compiler/scala/tools/nsc/interpreter/SimpleReader.scala
index 112b3e1e82..bca2e18e39 100644
--- a/src/compiler/scala/tools/nsc/interpreter/SimpleReader.scala
+++ b/src/compiler/scala/tools/nsc/interpreter/SimpleReader.scala
@@ -7,7 +7,7 @@
package scala.tools.nsc
package interpreter
-import java.io.{BufferedReader, PrintWriter}
+import java.io.{ BufferedReader, PrintWriter }
/** Reads using standard JDK API */
class SimpleReader(
diff --git a/src/compiler/scala/tools/nsc/matching/MatrixAdditions.scala b/src/compiler/scala/tools/nsc/matching/MatrixAdditions.scala
index c5f09cd8de..4f13d4fd99 100644
--- a/src/compiler/scala/tools/nsc/matching/MatrixAdditions.scala
+++ b/src/compiler/scala/tools/nsc/matching/MatrixAdditions.scala
@@ -115,7 +115,7 @@ trait MatrixAdditions extends ast.TreeDSL
object lxtt extends Transformer {
override def transform(tree: Tree): Tree = tree match {
case blck @ Block(vdefs, ld @ LabelDef(name, params, body)) =>
- def shouldInline(t: FinalState) = t.isReachedOnce && (t.label eq ld.symbol)
+ def shouldInline(t: FinalState) = t.isReachedOnce && (t.labelSym eq ld.symbol)
if (targets exists shouldInline) squeezedBlock(vdefs, body)
else blck
diff --git a/src/compiler/scala/tools/nsc/matching/ParallelMatching.scala b/src/compiler/scala/tools/nsc/matching/ParallelMatching.scala
index fd4b57ef67..7fce0ee73a 100644
--- a/src/compiler/scala/tools/nsc/matching/ParallelMatching.scala
+++ b/src/compiler/scala/tools/nsc/matching/ParallelMatching.scala
@@ -280,7 +280,9 @@ trait ParallelMatching extends ast.TreeDSL
lazy val pvgroup = PatternVarGroup.fromBindings(subst.get())
- final def tree(): Tree = squeezedBlock(pvgroup.valDefs, codegen)
+ final def tree(): Tree =
+ if (guard.isEmpty) success
+ else squeezedBlock(pvgroup.valDefs, codegen)
}
/** Mixture rule for all literal ints (and chars) i.e. hopefully a switch
diff --git a/src/compiler/scala/tools/nsc/symtab/Definitions.scala b/src/compiler/scala/tools/nsc/symtab/Definitions.scala
index 7a77095293..e1cf7a5a7e 100644
--- a/src/compiler/scala/tools/nsc/symtab/Definitions.scala
+++ b/src/compiler/scala/tools/nsc/symtab/Definitions.scala
@@ -19,7 +19,7 @@ trait Definitions {
// Working around bug #2133
private object definitionHelpers {
- def cond[T](x: T)(f: T =>? Boolean) = (f isDefinedAt x) && f(x)
+ def cond[T](x: T)(f: PartialFunction[T, Boolean]) = (f isDefinedAt x) && f(x)
}
import definitionHelpers._
diff --git a/src/compiler/scala/tools/nsc/symtab/Symbols.scala b/src/compiler/scala/tools/nsc/symtab/Symbols.scala
index 9f22bc54f7..5ee7409cc7 100644
--- a/src/compiler/scala/tools/nsc/symtab/Symbols.scala
+++ b/src/compiler/scala/tools/nsc/symtab/Symbols.scala
@@ -1238,7 +1238,7 @@ trait Symbols {
// appears to succeed but highly opaque errors come later: see bug #1286
if (res == false) {
val (f1, f2) = (this.sourceFile, that.sourceFile)
- if (f1 != null && f2 != null && f1 != f2)
+ if (f1 != null && f2 != null && f1.path != f2.path)
throw FatalError("Companions '" + this + "' and '" + that + "' must be defined in same file.")
}
@@ -1254,21 +1254,28 @@ trait Symbols {
else NoSymbol
}
+ /** A helper method that factors out the common code used to discover the companion module of a class. If a companion
+ * module exists, its symbol is returned; otherwise, `NoSymbol` is returned. The method assumes that `this`
+ * symbol has already been checked to be a class (using `isClass`). */
+ private final def linkedModuleOfClass0: Symbol =
+ flatOwnerInfo.decl(name.toTermName).suchThat(
+ sym => (sym hasFlag MODULE) && (sym isCoDefinedWith this))
+
/** The module or case class factory with the same name in the same
* package as this class. A better name would be companionModuleOfClass.
*/
final def linkedModuleOfClass: Symbol =
- if (this.isClass && !this.isAnonymousClass && !this.isRefinementClass) {
- flatOwnerInfo.decl(name.toTermName).suchThat(
- sym => (sym hasFlag MODULE) && (sym isCoDefinedWith this))
- } else NoSymbol
+ if (this.isClass && !this.isAnonymousClass && !this.isRefinementClass)
+ linkedModuleOfClass0
+ else NoSymbol
/** For a module its linked class, for a class its linked module or case
* factory otherwise.
*/
final def linkedSym: Symbol =
if (isTerm) linkedClassOfModule
- else if (isClass) flatOwnerInfo.decl(name.toTermName).suchThat(_ isCoDefinedWith this)
+ else if (isClass)
+ linkedModuleOfClass0
else NoSymbol
/** For a module class its linked class, for a plain class
diff --git a/src/compiler/scala/tools/nsc/symtab/Types.scala b/src/compiler/scala/tools/nsc/symtab/Types.scala
index be537010f6..c9aab59ff5 100644
--- a/src/compiler/scala/tools/nsc/symtab/Types.scala
+++ b/src/compiler/scala/tools/nsc/symtab/Types.scala
@@ -98,7 +98,7 @@ trait Types {
/** Undo all changes to constraints to type variables upto `limit'
*/
private def undoTo(limit: UndoLog) {
- while (log ne limit) {
+ while ((log ne limit) && log.nonEmpty) {
val (tv, constr) = log.head
tv.constr = constr
log = log.tail
@@ -503,40 +503,20 @@ trait Types {
/** The info of `sym', seen as a member of this type.
*/
- def memberInfo(sym: Symbol): Type =
+ def memberInfo(sym: Symbol): Type = {
+ // incCounter(ctr1)
sym.info.asSeenFrom(this, sym.owner)
+ }
/** The type of `sym', seen as a member of this type. */
def memberType(sym: Symbol): Type = {
- trackTypeIDE(sym)
+ // incCounter(ctr2)
//@M don't prematurely instantiate higher-kinded types, they will be instantiated by transform, typedTypeApply, etc. when really necessary
sym.tpeHK match {
case ov @ OverloadedType(pre, alts) =>
OverloadedType(this, alts)
-/*
- val pre1 = pre match {
- case ClassInfoType(_, _, clazz) => clazz.tpe
- case _ => pre
- }
- if (this =:= pre1) ov
- else if (this =:= pre1.narrow) OverloadedType(this, alts)
- else {
- Console.println("bad memberType of overloaded symbol: "+this+"/"+pre1+"/"+pre1.narrow)
- assert(false)
- ov
- }
-*/
case tp =>
- val res = tp.asSeenFrom(this, sym.owner)
-/*
- if (sym.name.toString == "Elem") {
- println("pre = "+this)
- println("pre.normalize = "+this.widen.normalize)
- println("sym = "+sym+" in "+sym.ownerChain)
- println("result = "+res)
- }
-*/
- res
+ tp.asSeenFrom(this, sym.owner)
}
}
@@ -595,20 +575,21 @@ trait Types {
/** Is this type a subtype of that type? */
def <:<(that: Type): Boolean = {
if (util.Statistics.enabled) stat_<:<(that)
- else
+ else {
(this eq that) ||
(if (explainSwitch) explain("<:", isSubType, this, that)
else isSubType(this, that, AnyDepth))
+ }
}
def stat_<:<(that: Type): Boolean = {
incCounter(subtypeCount)
- val start = startTimer(subtypeNanos)
+// val start = startTimer(subtypeNanos)
val result =
(this eq that) ||
(if (explainSwitch) explain("<:", isSubType, this, that)
else isSubType(this, that, AnyDepth))
- stopTimer(subtypeNanos, start)
+// stopTimer(subtypeNanos, start)
result
}
@@ -616,12 +597,12 @@ trait Types {
*/
def weak_<:<(that: Type): Boolean = {
incCounter(subtypeCount)
- val start = startTimer(subtypeNanos)
+// val start = startTimer(subtypeNanos)
val result =
((this eq that) ||
(if (explainSwitch) explain("weak_<:", isWeakSubType, this, that)
else isWeakSubType(this, that)))
- stopTimer(subtypeNanos, start)
+// stopTimer(subtypeNanos, start)
result
}
@@ -796,8 +777,8 @@ trait Types {
var member: Symbol = NoSymbol
var excluded = excludedFlags | DEFERRED
var continue = true
- lazy val self: Type = this.narrow
- lazy val membertpe = self.memberType(member)
+ var self: Type = null
+ var membertpe: Type = null
while (continue) {
continue = false
val bcs0 = baseClasses
@@ -821,22 +802,25 @@ trait Types {
} else if (member == NoSymbol) {
member = sym
} else if (members eq null) {
-// val start = startTimer(timer1)
if (member.name != sym.name ||
!(member == sym ||
member.owner != sym.owner &&
- !sym.hasFlag(PRIVATE) &&
- (membertpe matches self.memberType(sym)))) {
+ !sym.hasFlag(PRIVATE) && {
+ if (self eq null) self = this.narrow
+ if (membertpe eq null) membertpe = self.memberType(member)
+ (membertpe matches self.memberType(sym))
+ })) {
members = new Scope(List(member, sym))
}
-// stopTimer(timer1, start)
} else {
var prevEntry = members.lookupEntry(sym.name)
while ((prevEntry ne null) &&
!(prevEntry.sym == sym ||
prevEntry.sym.owner != sym.owner &&
- !sym.hasFlag(PRIVATE) &&
- (self.memberType(prevEntry.sym) matches self.memberType(sym)))) {
+ !sym.hasFlag(PRIVATE) && {
+ if (self eq null) self = this.narrow
+ self.memberType(prevEntry.sym) matches self.memberType(sym)
+ })) {
prevEntry = members lookupNextEntry prevEntry
}
if (prevEntry eq null) {
@@ -1237,7 +1221,7 @@ trait Types {
def memo[A](op1: => A)(op2: Type => A) = intersectionWitness get parents match {
case Some(w) =>
if (w eq this) op1 else op2(w)
- case None =>
+ case none =>
intersectionWitness(parents) = this
op1
}
@@ -1288,7 +1272,7 @@ trait Types {
* If they are several higher-kinded parents with different bounds we need
* to take the intersection of their bounds
*/
- override def normalize =
+ override def normalize = {
if (isHigherKinded)
PolyType(
typeParams,
@@ -1300,6 +1284,7 @@ trait Types {
},
decls))
else super.normalize
+ }
/** A refined type P1 with ... with Pn { decls } is volatile if
* one of the parent types Pi is an abstract type, and
@@ -1380,7 +1365,7 @@ trait Types {
*/
private def getRefs(which: Int, from: Symbol): Set[Symbol] = refs(which) get from match {
case Some(set) => set
- case None => Set()
+ case none => Set()
}
/** Augment existing refs map with reference <pre>from -> to</pre>
@@ -1523,7 +1508,7 @@ trait Types {
sym.isAbstractType && bounds.hi.isVolatile
override val isTrivial: Boolean =
- pre.isTrivial && !sym.isTypeParameter && args.forall(_.isTrivial)
+ !sym.isTypeParameter && pre.isTrivial && args.forall(_.isTrivial)
override def isNotNull =
sym.isModuleClass || sym == NothingClass || isValueClass(sym) || super.isNotNull
@@ -1668,13 +1653,15 @@ A type's typeSymbol should never be inspected directly.
// TODO: no test case in the suite because don't know how to tell partest to compile in different runs,
// and in a specific order
private var normalizeTyparCount = -1
- override def normalize: Type =
+
+ override def normalize: Type = {
if (phase.erasedTypes) normalize0
else if (normalized == null || typeParamsDirect.length != normalizeTyparCount) {
normalizeTyparCount = typeParamsDirect.length
normalized = normalize0
normalized
} else normalized
+ }
override def decls: Scope = {
sym.info match {
@@ -2546,7 +2533,7 @@ A type's typeSymbol should never be inspected directly.
case TypeRef(_, sym, _) =>
occurCount get sym match {
case Some(count) => occurCount += (sym -> (count + 1))
- case None =>
+ case none =>
}
case _ =>
}
@@ -2745,13 +2732,18 @@ A type's typeSymbol should never be inspected directly.
/** Map this function over given type */
def mapOver(tp: Type): Type = tp match {
- case ErrorType => tp
- case WildcardType => tp
- case NoType => tp
- case NoPrefix => tp
+ case TypeRef(pre, sym, args) =>
+ val pre1 = this(pre)
+ //val args1 = args mapConserve this(_)
+ val args1 = if (args.isEmpty) args
+ else {
+ val tparams = sym.typeParams
+ if (tparams.isEmpty) args
+ else mapOverArgs(args, tparams)
+ }
+ if ((pre1 eq pre) && (args1 eq args)) tp
+ else typeRef(pre1, sym, args1)
case ThisType(_) => tp
- case ConstantType(_) => tp
- case DeBruijnIndex(_, _) => tp
case SingleType(pre, sym) =>
if (sym.isPackageClass) tp // short path
else {
@@ -2759,22 +2751,28 @@ A type's typeSymbol should never be inspected directly.
if (pre1 eq pre) tp
else singleType(pre1, sym)
}
+ case MethodType(params, result) =>
+ variance = -variance
+ val params1 = mapOver(params)
+ variance = -variance
+ val result1 = this(result)
+ if ((params1 eq params) && (result1 eq result)) tp
+ // for new dependent types: result1.substSym(params, params1)?
+ else copyMethodType(tp, params1, result1.substSym(params, params1))
+ case PolyType(tparams, result) =>
+ variance = -variance
+ val tparams1 = mapOver(tparams)
+ variance = -variance
+ var result1 = this(result)
+ if ((tparams1 eq tparams) && (result1 eq result)) tp
+ else PolyType(tparams1, result1.substSym(tparams, tparams1))
+ case ConstantType(_) => tp
+ case DeBruijnIndex(_, _) => tp
case SuperType(thistp, supertp) =>
val thistp1 = this(thistp)
val supertp1 = this(supertp)
if ((thistp1 eq thistp) && (supertp1 eq supertp)) tp
else mkSuperType(thistp1, supertp1)
- case TypeRef(pre, sym, args) =>
- val pre1 = this(pre)
- //val args1 = args mapConserve this(_)
- val args1 = if (args.isEmpty) args
- else {
- val tparams = sym.typeParams
- if (tparams.isEmpty) args
- else mapOverArgs(args, tparams)
- }
- if ((pre1 eq pre) && (args1 eq args)) tp
- else typeRef(pre1, sym, args1)
case TypeBounds(lo, hi) =>
variance = -variance
val lo1 = this(lo)
@@ -2792,28 +2790,6 @@ A type's typeSymbol should never be inspected directly.
//if ((parents1 eq parents) && (decls1 eq decls)) tp
//else refinementOfClass(tp.typeSymbol, parents1, decls1)
copyRefinedType(rtp, parents1, decls1)
-/*
- case ClassInfoType(parents, decls, clazz) =>
- val parents1 = parents mapConserve (this);
- val decls1 = mapOver(decls);
- if ((parents1 eq parents) && (decls1 eq decls)) tp
- else cloneDecls(ClassInfoType(parents1, new Scope(), clazz), tp, decls1)
-*/
- case MethodType(params, result) =>
- variance = -variance
- val params1 = mapOver(params)
- variance = -variance
- val result1 = this(result)
- if ((params1 eq params) && (result1 eq result)) tp
- // for new dependent types: result1.substSym(params, params1)?
- else copyMethodType(tp, params1, result1.substSym(params, params1))
- case PolyType(tparams, result) =>
- variance = -variance
- val tparams1 = mapOver(tparams)
- variance = -variance
- var result1 = this(result)
- if ((tparams1 eq tparams) && (result1 eq result)) tp
- else PolyType(tparams1, result1.substSym(tparams, tparams1))
case ExistentialType(tparams, result) =>
val tparams1 = mapOver(tparams)
var result1 = this(result)
@@ -2841,6 +2817,12 @@ A type's typeSymbol should never be inspected directly.
if ((annots1 eq annots) && (atp1 eq atp)) tp
else if (annots1.isEmpty) atp1
else AnnotatedType(annots1, atp1, selfsym)
+/*
+ case ErrorType => tp
+ case WildcardType => tp
+ case NoType => tp
+ case NoPrefix => tp
+*/
case _ =>
tp
// throw new Error("mapOver inapplicable for " + tp);
@@ -3055,13 +3037,13 @@ A type's typeSymbol should never be inspected directly.
def stabilize(pre: Type, clazz: Symbol): Type = {
capturedPre get clazz match {
- case None =>
+ case Some(qvar) =>
+ qvar
+ case _ =>
val qvar = clazz freshExistential ".type" setInfo singletonBounds(pre)
capturedPre += (clazz -> qvar)
capturedParams = qvar :: capturedParams
qvar
- case Some(qvar) =>
- qvar
}
}.tpe
@@ -3076,6 +3058,7 @@ A type's typeSymbol should never be inspected directly.
if (b == NoType && clazz.isRefinementClass) pre
else b
}
+
def apply(tp: Type): Type =
if ((pre eq NoType) || (pre eq NoPrefix) || !clazz.isClass) tp
else tp match {
@@ -3255,7 +3238,7 @@ A type's typeSymbol should never be inspected directly.
} else {
giveup()
}
- case None => super.transform(tree)
+ case none => super.transform(tree)
}
case tree => super.transform(tree)
}
@@ -3761,21 +3744,22 @@ A type's typeSymbol should never be inspected directly.
/** Do `tp1' and `tp2' denote equivalent types?
*/
- def isSameType(tp1: Type, tp2: Type): Boolean = try {
+ def isSameType(tp1: Type, tp2: Type): Boolean = { val start = startTimer(timer1); try {
incCounter(sametypeCount)
subsametypeRecursions += 1
undoLog undoUnless {
- isSameType0(tp1, tp2)
+ isSameType1(tp1, tp2)
}
} finally {
subsametypeRecursions -= 1
- if (subsametypeRecursions == 0) undoLog clear
- }
+ if (subsametypeRecursions == 0) undoLog.clear
+ stopTimer(timer1, start)
+ }}
def isDifferentType(tp1: Type, tp2: Type): Boolean = try {
subsametypeRecursions += 1
undoLog undo { // undo type constraints that arise from operations in this block
- !isSameType0(tp1, tp2)
+ !isSameType1(tp1, tp2)
}
} finally {
subsametypeRecursions -= 1
@@ -3806,7 +3790,8 @@ A type's typeSymbol should never be inspected directly.
}
*/
- private def isSameType0(tp1: Type, tp2: Type): Boolean =
+ private def isSameType0(tp1: Type, tp2: Type): Boolean = {
+ if (tp1 eq tp2) return true
((tp1, tp2) match {
case (ErrorType, _) => true
case (WildcardType, _) => true
@@ -3822,7 +3807,7 @@ A type's typeSymbol should never be inspected directly.
if (sym1 == sym2) =>
true
case (SingleType(pre1, sym1), SingleType(pre2, sym2))
- if equalSymsAndPrefixes(sym1, pre1, sym2, pre2) =>
+ if (equalSymsAndPrefixes(sym1, pre1, sym2, pre2)) =>
true
/*
case (SingleType(pre1, sym1), ThisType(sym2))
@@ -3908,6 +3893,162 @@ A type's typeSymbol should never be inspected directly.
val tp2n = normalizePlus(tp2)
((tp1n ne tp1) || (tp2n ne tp2)) && isSameType(tp1n, tp2n)
}
+ }
+
+ private def isSameType1(tp1: Type, tp2: Type): Boolean = {
+ if ((tp1 eq tp2) ||
+ (tp1 eq ErrorType) || (tp1 eq WildcardType) ||
+ (tp2 eq ErrorType) || (tp2 eq WildcardType))
+ true
+ else if ((tp1 eq NoType) || (tp2 eq NoType))
+ false
+ else if (tp1 eq NoPrefix)
+ tp2.typeSymbol.isPackageClass
+ else if (tp2 eq NoPrefix)
+ tp1.typeSymbol.isPackageClass
+ else {
+ isSameType2(tp1, tp2) || {
+ val tp1n = normalizePlus(tp1)
+ val tp2n = normalizePlus(tp2)
+ ((tp1n ne tp1) || (tp2n ne tp2)) && isSameType(tp1n, tp2n)
+ }
+ }
+ }
+
+ def isSameType2(tp1: Type, tp2: Type): Boolean = {
+ tp1 match {
+ case tr1: TypeRef =>
+ tp2 match {
+ case tr2: TypeRef =>
+ return equalSymsAndPrefixes(tr1.sym, tr1.pre, tr2.sym, tr2.pre) &&
+ ((tp1.isHigherKinded && tp2.isHigherKinded && tp1.normalize =:= tp2.normalize) ||
+ isSameTypes(tr1.args, tr2.args))
+ case _ =>
+ }
+ case ThisType(sym1) =>
+ tp2 match {
+ case ThisType(sym2) =>
+ if (sym1 == sym2) return true
+ case _ =>
+ }
+ case SingleType(pre1, sym1) =>
+ tp2 match {
+ case SingleType(pre2, sym2) =>
+ if (equalSymsAndPrefixes(sym1, pre1, sym2, pre2)) return true
+ case _ =>
+ }
+ case ConstantType(value1) =>
+ tp2 match {
+ case ConstantType(value2) =>
+ return (value1 == value2)
+ case _ =>
+ }
+ case RefinedType(parents1, ref1) =>
+ tp2 match {
+ case RefinedType(parents2, ref2) =>
+ def isSubScope(s1: Scope, s2: Scope): Boolean = s2.toList.forall {
+ sym2 =>
+ var e1 = s1.lookupEntry(sym2.name)
+ (e1 ne null) && {
+ val substSym = sym2.info.substThis(sym2.owner, e1.sym.owner.thisType)
+ var isEqual = false
+ while (!isEqual && (e1 ne null)) {
+ isEqual = e1.sym.info =:= substSym
+ e1 = s1.lookupNextEntry(e1)
+ }
+ isEqual
+ }
+ }
+ //Console.println("is same? " + tp1 + " " + tp2 + " " + tp1.typeSymbol.owner + " " + tp2.typeSymbol.owner)//DEBUG
+ return isSameTypes(parents1, parents2) &&
+ isSubScope(ref1, ref2) && isSubScope(ref2, ref1)
+ case _ =>
+ }
+ case MethodType(params1, res1) =>
+ tp2 match {
+ case MethodType(params2, res2) =>
+ // new dependent types: probably fix this, use substSym as done for PolyType
+ return isSameTypes(tp1.paramTypes, tp2.paramTypes) &&
+ res1 =:= res2 &&
+ tp1.isInstanceOf[ImplicitMethodType] == tp2.isInstanceOf[ImplicitMethodType]
+ case _ =>
+ }
+ case PolyType(tparams1, res1) =>
+ tp2 match {
+ case PolyType(tparams2, res2) =>
+// assert((tparams1 map (_.typeParams.length)) == (tparams2 map (_.typeParams.length)))
+ return tparams1.length == tparams2.length &&
+ (tparams1, tparams2).zipped.forall((p1, p2) =>
+ p1.info =:= p2.info.substSym(tparams2, tparams1)) && //@M looks like it might suffer from same problem as #2210
+ res1 =:= res2.substSym(tparams2, tparams1)
+ case _ =>
+ }
+ case ExistentialType(tparams1, res1) =>
+ tp2 match {
+ case ExistentialType(tparams2, res2) =>
+ return (tparams1.length == tparams2.length &&
+ (tparams1, tparams2).zipped.forall
+ ((p1, p2) => p1.info =:= p2.info.substSym(tparams2, tparams1)) && //@M looks like it might suffer from same problem as #2210
+ res1 =:= res2.substSym(tparams2, tparams1))
+ case _ =>
+ }
+ case TypeBounds(lo1, hi1) =>
+ tp2 match {
+ case TypeBounds(lo2, hi2) =>
+ return lo1 =:= lo2 && hi1 =:= hi2
+ case _ =>
+ }
+ case BoundedWildcardType(bounds) =>
+ return bounds containsType tp2
+ case _ =>
+ }
+ tp2 match {
+ case BoundedWildcardType(bounds) =>
+ return bounds containsType tp1
+ case _ =>
+ }
+ tp1 match {
+ case tv @ TypeVar(_,_) =>
+ return tv.registerTypeEquality(tp2, true)
+ case _ =>
+ }
+ tp2 match {
+ case tv @ TypeVar(_,_) =>
+ return tv.registerTypeEquality(tp1, false)
+ case _ =>
+ }
+ tp1 match {
+ case AnnotatedType(_,_,_) =>
+ return annotationsConform(tp1, tp2) && annotationsConform(tp2, tp1) && tp1.withoutAnnotations =:= tp2.withoutAnnotations
+ case _ =>
+ }
+ tp2 match {
+ case AnnotatedType(_,_,_) =>
+ return annotationsConform(tp1, tp2) && annotationsConform(tp2, tp1) && tp1.withoutAnnotations =:= tp2.withoutAnnotations
+ case _ =>
+ }
+ tp1 match {
+ case _: SingletonType =>
+ tp2 match {
+ case _: SingletonType =>
+ var origin1 = tp1
+ while (origin1.underlying.isInstanceOf[SingletonType]) {
+ assert(origin1 ne origin1.underlying, origin1)
+ origin1 = origin1.underlying
+ }
+ var origin2 = tp2
+ while (origin2.underlying.isInstanceOf[SingletonType]) {
+ assert(origin2 ne origin2.underlying, origin2)
+ origin2 = origin2.underlying
+ }
+ ((origin1 ne tp1) || (origin2 ne tp2)) && (origin1 =:= origin2)
+ case _ =>
+ false
+ }
+ case _ =>
+ false
+ }
+ }
/** Are `tps1' and `tps2' lists of pairwise equivalent
* types?
diff --git a/src/compiler/scala/tools/nsc/transform/CleanUp.scala b/src/compiler/scala/tools/nsc/transform/CleanUp.scala
index 09ed32253a..cf217dcd23 100644
--- a/src/compiler/scala/tools/nsc/transform/CleanUp.scala
+++ b/src/compiler/scala/tools/nsc/transform/CleanUp.scala
@@ -280,7 +280,7 @@ abstract class CleanUp extends Transform with ast.TreeDSL {
val testForBoolean: Tree = (qual IS_OBJ BoxedBooleanClass.tpe)
val testForNumberOrBoolean = testForNumber OR testForBoolean
- val getPrimitiveReplacementForStructuralCall: Name =>? (Symbol, Tree) = {
+ val getPrimitiveReplacementForStructuralCall: PartialFunction[Name, (Symbol, Tree)] = {
val testsForNumber = Map() ++ List(
nme.UNARY_+ -> "positive",
nme.UNARY_- -> "negate",
diff --git a/src/compiler/scala/tools/nsc/transform/Erasure.scala b/src/compiler/scala/tools/nsc/transform/Erasure.scala
index de9dadbd1f..e92ba64469 100644
--- a/src/compiler/scala/tools/nsc/transform/Erasure.scala
+++ b/src/compiler/scala/tools/nsc/transform/Erasure.scala
@@ -120,16 +120,23 @@ abstract class Erasure extends AddInterfaces with typechecker.Analyzer with ast.
// Compute the erasure of the intersection type with given `parents` according to new spec.
private def intersectionErasure(parents: List[Type]): Type =
if (parents.isEmpty) erasedTypeRef(ObjectClass)
- else {
- // implement new spec for erasure of refined types.
+ else apply {
val psyms = parents map (_.typeSymbol)
- def isUnshadowed(psym: Symbol) =
- !(psyms exists (qsym => (psym ne qsym) && (qsym isNonBottomSubClass psym)))
- val cs = parents.iterator.filter { p => // isUnshadowed is a bit expensive, so try classes first
- val psym = p.typeSymbol
- psym.isClass && !psym.isTrait && isUnshadowed(psym)
+ if (psyms contains ArrayClass) {
+ // treat arrays specially
+ arrayType(
+ intersectionErasure(
+ parents filter (_.typeSymbol == ArrayClass) map (_.typeArgs.head)))
+ } else {
+ // implement new spec for erasure of refined types.
+ def isUnshadowed(psym: Symbol) =
+ !(psyms exists (qsym => (psym ne qsym) && (qsym isNonBottomSubClass psym)))
+ val cs = parents.iterator.filter { p => // isUnshadowed is a bit expensive, so try classes first
+ val psym = p.typeSymbol
+ psym.isClass && !psym.isTrait && isUnshadowed(psym)
+ }
+ (if (cs.hasNext) cs else parents.iterator.filter(p => isUnshadowed(p.typeSymbol))).next()
}
- apply((if (cs.hasNext) cs else parents.iterator.filter(p => isUnshadowed(p.typeSymbol))).next())
}
def apply(tp: Type): Type = {
diff --git a/src/compiler/scala/tools/nsc/transform/SpecializeTypes.scala b/src/compiler/scala/tools/nsc/transform/SpecializeTypes.scala
index 0727835e00..6937658534 100644
--- a/src/compiler/scala/tools/nsc/transform/SpecializeTypes.scala
+++ b/src/compiler/scala/tools/nsc/transform/SpecializeTypes.scala
@@ -739,8 +739,9 @@ abstract class SpecializeTypes extends InfoTransform with TypingTransformers {
override def mapOver(tp: Type): Type = tp match {
case ClassInfoType(parents, decls, clazz) =>
val parents1 = parents mapConserve (this);
- val decls1 = mapOver(decls.toList);
- if ((parents1 eq parents) && (decls1 eq decls)) tp
+ val declsList = decls.toList
+ val decls1 = mapOver(declsList);
+ if ((parents1 eq parents) && (decls1 eq declsList)) tp
else ClassInfoType(parents1, new Scope(decls1), clazz)
case AnnotatedType(annots, atp, selfsym) =>
val annots1 = mapOverAnnotations(annots)
@@ -1254,7 +1255,7 @@ abstract class SpecializeTypes extends InfoTransform with TypingTransformers {
for (tp <- owner.info.memberType(target).typeParams)
yield
if (!env.isDefinedAt(tp))
- typeRef(NoPrefix, from.info.typeParams.find(_ == tp.name).get, Nil)
+ typeRef(NoPrefix, from.info.typeParams.find(_.name == tp.name).get, Nil)
else if ((env(tp) <:< tp.info.bounds.hi) && (tp.info.bounds.lo <:< env(tp)))
env(tp)
else tp.info.bounds.hi
diff --git a/src/compiler/scala/tools/nsc/transform/UnCurry.scala b/src/compiler/scala/tools/nsc/transform/UnCurry.scala
index fd5dd0f9a3..dc475b4173 100644
--- a/src/compiler/scala/tools/nsc/transform/UnCurry.scala
+++ b/src/compiler/scala/tools/nsc/transform/UnCurry.scala
@@ -296,11 +296,11 @@ abstract class UnCurry extends InfoTransform with TypingTransformers {
* }
* new $anon()
*
- * transform a function node (x => body) of type T =>? R where
+ * transform a function node (x => body) of type PartialFunction[T, R] where
* body = expr match { case P_i if G_i => E_i }_i=1..n
* to:
*
- * class $anon() extends Object() with T =>? R with ScalaObject {
+ * class $anon() extends Object() with PartialFunction[T, R] with ScalaObject {
* def apply(x: T): R = (expr: @unchecked) match {
* { case P_i if G_i => E_i }_i=1..n
* def isDefinedAt(x: T): boolean = (x: @unchecked) match {
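For reference, a minimal sketch (hypothetical object and member names) of the source form this comment describes: a pattern-matching anonymous function typed as PartialFunction[T, R], which UnCurry expands into an anonymous class providing apply and isDefinedAt.

object PartialFunDemo {
  // UnCurry turns this literal into the $anon class sketched in the comment above,
  // with apply running the match and isDefinedAt testing the patterns
  val describeEvens: PartialFunction[Int, String] = {
    case n if n % 2 == 0 => n + " is even"
  }
}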
diff --git a/src/compiler/scala/tools/nsc/typechecker/Contexts.scala b/src/compiler/scala/tools/nsc/typechecker/Contexts.scala
index 2acc09b70e..dd592bb96d 100644
--- a/src/compiler/scala/tools/nsc/typechecker/Contexts.scala
+++ b/src/compiler/scala/tools/nsc/typechecker/Contexts.scala
@@ -180,7 +180,7 @@ trait Contexts { self: Analyzer =>
scope: Scope, imports: List[ImportInfo]): Context = {
val c = new Context
c.unit = unit
- c.tree = sanitize(tree)
+ c.tree = /*sanitize*/(tree) // used to be for IDE
c.owner = owner
c.scope = scope
@@ -464,20 +464,35 @@ trait Contexts { self: Analyzer =>
implicitsCache = null
if (outer != null && outer != this) outer.resetCache
}
- private def collectImplicits(syms: List[Symbol], pre: Type): List[ImplicitInfo] =
- for (sym <- syms if sym.hasFlag(IMPLICIT) && isAccessible(sym, pre, false))
+
+ /** A symbol `sym` qualifies as an implicit if it has the IMPLICIT flag set,
+ * it is accessible, and if it is imported there is not already a local symbol
+ * with the same name. Local symbols override imported ones. This fixes #2866.
+ */
+ private def isQualifyingImplicit(sym: Symbol, pre: Type, imported: Boolean) =
+ sym.hasFlag(IMPLICIT) &&
+ isAccessible(sym, pre, false) &&
+ !(imported && {
+ val e = scope.lookupEntry(sym.name)
+ (e ne null) && (e.owner == scope)
+ })
+
+ private def collectImplicits(syms: List[Symbol], pre: Type, imported: Boolean = false): List[ImplicitInfo] =
+ for (sym <- syms if isQualifyingImplicit(sym, pre, imported))
yield new ImplicitInfo(sym.name, pre, sym)
private def collectImplicitImports(imp: ImportInfo): List[ImplicitInfo] = {
val pre = imp.qual.tpe
def collect(sels: List[ImportSelector]): List[ImplicitInfo] = sels match {
- case List() => List()
- case List(ImportSelector(nme.WILDCARD, _, _, _)) => collectImplicits(pre.implicitMembers, pre)
+ case List() =>
+ List()
+ case List(ImportSelector(nme.WILDCARD, _, _, _)) =>
+ collectImplicits(pre.implicitMembers, pre, imported = true)
case ImportSelector(from, _, to, _) :: sels1 =>
var impls = collect(sels1) filter (info => info.name != from)
if (to != nme.WILDCARD) {
for (sym <- imp.importedSymbol(to).alternatives)
- if (sym.hasFlag(IMPLICIT) && isAccessible(sym, pre, false))
+ if (isQualifyingImplicit(sym, pre, imported = true))
impls = new ImplicitInfo(to, pre, sym) :: impls
}
impls
@@ -488,7 +503,6 @@ trait Contexts { self: Analyzer =>
def implicitss: List[List[ImplicitInfo]] = {
val nextOuter = if (owner.isConstructor) outer.outer.outer else outer
- // can we can do something smarter to bring back the implicit cache?
if (implicitsRunId != currentRunId) {
implicitsRunId = currentRunId
implicitsCache = List()
@@ -545,7 +559,7 @@ trait Contexts { self: Analyzer =>
/** The prefix expression */
def qual: Tree = tree.symbol.info match {
case ImportType(expr) => expr
- case ErrorType => tree
+ case ErrorType => tree setType NoType // fix for #2870
case _ => throw new FatalError("symbol " + tree.symbol + " has bad type: " + tree.symbol.info);//debug
}
@@ -561,16 +575,16 @@ trait Contexts { self: Analyzer =>
var renamed = false
var selectors = tree.selectors
while (selectors != Nil && result == NoSymbol) {
- if (selectors.head.name != nme.WILDCARD)
- notifyImport(name, qual.tpe, selectors.head.name, selectors.head.rename)
+// if (selectors.head.name != nme.WILDCARD) // used to be for IDE
+// notifyImport(name, qual.tpe, selectors.head.name, selectors.head.rename)
if (selectors.head.rename == name.toTermName)
- result = qual.tpe.member(
+ result = qual.tpe.nonLocalMember( // new to address #2733: consider only non-local members for imports
if (name.isTypeName) selectors.head.name.toTypeName else selectors.head.name)
else if (selectors.head.name == name.toTermName)
renamed = true
else if (selectors.head.name == nme.WILDCARD && !renamed)
- result = qual.tpe.member(name)
+ result = qual.tpe.nonLocalMember(name)
selectors = selectors.tail
}
result
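A minimal sketch (hypothetical objects A and B) of the rule isQualifyingImplicit now encodes for #2866: an imported implicit is skipped when a local symbol with the same name already exists in the enclosing scope.

object A { implicit val answer: Int = 1 }

object B {
  import A._
  implicit val answer: Int = 2
  // resolves unambiguously to the local value (2); the imported `answer` is not considered
  def resolved: Int = implicitly[Int]
}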
diff --git a/src/compiler/scala/tools/nsc/typechecker/Implicits.scala b/src/compiler/scala/tools/nsc/typechecker/Implicits.scala
index 9657cea101..0539b4ee17 100644
--- a/src/compiler/scala/tools/nsc/typechecker/Implicits.scala
+++ b/src/compiler/scala/tools/nsc/typechecker/Implicits.scala
@@ -459,7 +459,9 @@ self: Analyzer =>
if (traceImplicits) println("tvars = "+tvars+"/"+(tvars map (_.constr)))
val targs = solvedTypes(tvars, undetParams, undetParams map varianceInType(pt),
false, lubDepth(List(itree2.tpe, pt)))
- checkBounds(itree2.pos, NoPrefix, NoSymbol, undetParams, targs, "inferred ") // #2421
+
+ // #2421: check that we correctly instantiated type parameters outside of the implicit tree:
+ checkBounds(itree2.pos, NoPrefix, NoSymbol, undetParams, targs, "inferred ")
// filter out failures from type inference, don't want to remove them from undetParams!
// we must be conservative in leaving type params in undetparams
@@ -472,6 +474,18 @@ self: Analyzer =>
val subst = new TreeTypeSubstituter(okParams, okArgs)
subst traverse itree2
+ // #2421b: since type inference (which may have been performed during implicit search)
+ // does not check whether inferred arguments meet the bounds of the corresponding parameter (see note in solvedTypes),
+ // must check again here:
+ itree2 match { // roughly equivalent to typed1(itree2, EXPRmode, wildPt),
+ // since typed1 only forces checking of the outer tree and calls typed on the subtrees
+ // (they have already been type checked, by the typed1(itree...) above, so the subtrees are skipped by typed)
+ // inlining the essential bit here for clarity
+ //TODO: verify that these subtrees don't need re-checking
+ case TypeApply(fun, args) => typedTypeApply(itree2, EXPRmode, fun, args)
+ case _ =>
+ }
+
val result = new SearchResult(itree2, subst)
incCounter(foundImplicits)
if (traceImplicits) println("RESULT = "+result)
diff --git a/src/compiler/scala/tools/nsc/typechecker/Infer.scala b/src/compiler/scala/tools/nsc/typechecker/Infer.scala
index 5c6788f0f6..621b6dce11 100644
--- a/src/compiler/scala/tools/nsc/typechecker/Infer.scala
+++ b/src/compiler/scala/tools/nsc/typechecker/Infer.scala
@@ -159,6 +159,9 @@ trait Infer {
if (!solve(tvars, tparams, variances, upper, depth)) {
// no panic, it's good enough to just guess a solution, we'll find out
// later whether it works.
+// @M danger, Will Robinson! this means that you should never trust inferred type arguments!
+// need to call checkBounds on the args/typars or type1 on the tree for the expression that results from type inference
+// see e.g., #2421: implicit search had been ignoring this caveat
// throw new DeferredNoInstance(() =>
// "no solution exists for constraints"+(tvars map boundsString))
}
@@ -990,12 +993,12 @@ trait Infer {
val specificCount = (if (isAsSpecific(ftpe1, ftpe2)) 1 else 0) -
(if (isAsSpecific(ftpe2, ftpe1) &&
// todo: move to isAsSepecific test
- (!ftpe2.isInstanceOf[OverloadedType] || ftpe1.isInstanceOf[OverloadedType]) &&
+// (!ftpe2.isInstanceOf[OverloadedType] || ftpe1.isInstanceOf[OverloadedType]) &&
(!phase.erasedTypes || covariantReturnOverride(ftpe1, ftpe2))) 1 else 0)
val subClassCount = (if (isInProperSubClassOrObject(sym1, sym2)) 1 else 0) -
(if (isInProperSubClassOrObject(sym2, sym1)) 1 else 0)
- //println("is more specific? "+sym1+sym1.locationString+"/"+sym2+sym2.locationString+":"+
- // specificCount+"/"+subClassCount)
+// println("is more specific? "+sym1+":"+ftpe1+sym1.locationString+"/"+sym2+":"+ftpe2+sym2.locationString+":"+
+// specificCount+"/"+subClassCount)
specificCount + subClassCount > 0
}
}
@@ -1714,6 +1717,7 @@ trait Infer {
})
def improves(sym1: Symbol, sym2: Symbol) =
+// util.trace("improve "+sym1+sym1.locationString+" on "+sym2+sym2.locationString)(
sym2 == NoSymbol || sym2.isError ||
isStrictlyMoreSpecific(followApply(pre.memberType(sym1)),
followApply(pre.memberType(sym2)), sym1, sym2)
diff --git a/src/compiler/scala/tools/nsc/typechecker/Namers.scala b/src/compiler/scala/tools/nsc/typechecker/Namers.scala
index 1955348f91..ae8fd3a956 100644
--- a/src/compiler/scala/tools/nsc/typechecker/Namers.scala
+++ b/src/compiler/scala/tools/nsc/typechecker/Namers.scala
@@ -1188,8 +1188,10 @@ trait Namers { self: Analyzer =>
def checkSelectors(selectors: List[ImportSelector]): Unit = selectors match {
case ImportSelector(from, _, to, _) :: rest =>
if (from != nme.WILDCARD && base != ErrorType) {
- if (base.member(from) == NoSymbol && base.member(from.toTypeName) == NoSymbol)
- context.error(tree.pos, from.decode + " is not a member of " + expr);
+ if (base.nonLocalMember(from) == NoSymbol &&
+ base.nonLocalMember(from.toTypeName) == NoSymbol)
+ context.error(tree.pos, from.decode + " is not a member of " + expr)
+
if (checkNotRedundant(tree.pos, from, to))
checkNotRedundant(tree.pos, from.toTypeName, to.toTypeName)
}
diff --git a/src/compiler/scala/tools/nsc/typechecker/SuperAccessors.scala b/src/compiler/scala/tools/nsc/typechecker/SuperAccessors.scala
index e59b469057..257ab243b4 100644
--- a/src/compiler/scala/tools/nsc/typechecker/SuperAccessors.scala
+++ b/src/compiler/scala/tools/nsc/typechecker/SuperAccessors.scala
@@ -149,7 +149,7 @@ abstract class SuperAccessors extends transform.Transform with transform.TypingT
for (member <- sym.info.members) {
println(member+":"+sym.thisType.memberInfo(member)+"\n"+
toJavaDoc(expandedDocComment(member, sym)))
- for ((useCase, comment) <- useCases(member, sym)) {
+ for ((useCase, comment, pos) <- useCases(member, sym)) {
println("usecase "+useCase+":"+useCase.info)
println(toJavaDoc(comment))
}
diff --git a/src/compiler/scala/tools/nsc/typechecker/SyntheticMethods.scala b/src/compiler/scala/tools/nsc/typechecker/SyntheticMethods.scala
index 676279a8d2..9ae56f05a3 100644
--- a/src/compiler/scala/tools/nsc/typechecker/SyntheticMethods.scala
+++ b/src/compiler/scala/tools/nsc/typechecker/SyntheticMethods.scala
@@ -175,10 +175,13 @@ trait SyntheticMethods extends ast.TreeDSL {
def makeTrees(acc: Symbol, cpt: Type): (Tree, Bind) = {
val varName = context.unit.fresh.newName(clazz.pos.focus, acc.name + "$")
val (eqMethod, binding) =
- if (isRepeatedParamType(cpt)) (nme.sameElements, Star(WILD()))
- else (nme.EQ , WILD() )
-
- ((varName DOT eqMethod)(Ident(acc)), varName BIND binding)
+ if (isRepeatedParamType(cpt))
+ (TypeApply(varName DOT nme.sameElements, List(TypeTree(cpt.baseType(SeqClass).typeArgs.head))),
+ Star(WILD()))
+ else
+ ((varName DOT nme.EQ): Tree,
+ WILD())
+ (eqMethod APPLY Ident(acc), varName BIND binding)
}
// Creates list of parameters and a guard for each
diff --git a/src/compiler/scala/tools/nsc/typechecker/Typers.scala b/src/compiler/scala/tools/nsc/typechecker/Typers.scala
index f29f6fa7a3..515c7ad354 100644
--- a/src/compiler/scala/tools/nsc/typechecker/Typers.scala
+++ b/src/compiler/scala/tools/nsc/typechecker/Typers.scala
@@ -2900,6 +2900,56 @@ trait Typers { self: Analyzer =>
TypeTree(ExistentialType(typeParams, tpe)) setOriginal tree
}
+ // lifted out of typed1 because it's needed in typedImplicit0
+ protected def typedTypeApply(tree: Tree, mode: Int, fun: Tree, args: List[Tree]): Tree = fun.tpe match {
+ case OverloadedType(pre, alts) =>
+ inferPolyAlternatives(fun, args map (_.tpe))
+ val tparams = fun.symbol.typeParams //@M TODO: fun.symbol.info.typeParams ? (as in typedAppliedTypeTree)
+ val args1 = if(args.length == tparams.length) {
+ //@M: in case TypeApply we can't check the kind-arities of the type arguments,
+ // as we don't know which alternative to choose... here we do
+ map2Conserve(args, tparams) {
+ //@M! the polytype denotes the expected kind
+ (arg, tparam) => typedHigherKindedType(arg, mode, polyType(tparam.typeParams, AnyClass.tpe))
+ }
+ } else // @M: there's probably something wrong when args.length != tparams.length... (triggered by bug #320)
+ // Martin, I'm using fake trees, because, if you use args or arg.map(typedType),
+ // inferPolyAlternatives loops... -- I have no idea why :-(
+ // ...actually this was looping anyway, see bug #278.
+ return errorTree(fun, "wrong number of type parameters for "+treeSymTypeMsg(fun))
+
+ typedTypeApply(tree, mode, fun, args1)
+ case SingleType(_, _) =>
+ typedTypeApply(tree, mode, fun setType fun.tpe.widen, args)
+ case PolyType(tparams, restpe) if (tparams.length != 0) =>
+ if (tparams.length == args.length) {
+ val targs = args map (_.tpe)
+ checkBounds(tree.pos, NoPrefix, NoSymbol, tparams, targs, "")
+ if (fun.symbol == Predef_classOf) {
+ checkClassType(args.head, true, false)
+ atPos(tree.pos) { gen.mkClassOf(targs.head) }
+ } else {
+ if (phase.id <= currentRun.typerPhase.id &&
+ fun.symbol == Any_isInstanceOf && !targs.isEmpty)
+ checkCheckable(tree.pos, targs.head, "")
+ val resultpe = restpe.instantiateTypeParams(tparams, targs)
+ //@M substitution in instantiateParams needs to be careful!
+ //@M example: class Foo[a] { def foo[m[x]]: m[a] = error("") } (new Foo[Int]).foo[List] : List[Int]
+ //@M --> first, m[a] gets changed to m[Int], then m gets substituted for List,
+ // this must preserve m's type argument, so that we end up with List[Int], and not List[a]
+ //@M related bug: #1438
+ //println("instantiating type params "+restpe+" "+tparams+" "+targs+" = "+resultpe)
+ treeCopy.TypeApply(tree, fun, args) setType resultpe
+ }
+ } else {
+ errorTree(tree, "wrong number of type parameters for "+treeSymTypeMsg(fun))
+ }
+ case ErrorType =>
+ setError(tree)
+ case _ =>
+ errorTree(tree, treeSymTypeMsg(fun)+" does not take type parameters.")
+ }
+
/**
* @param tree ...
* @param mode ...
@@ -3184,55 +3234,6 @@ trait Typers { self: Analyzer =>
errorTree(expr1, "_ must follow method; cannot follow " + expr1.tpe)
}
- def typedTypeApply(fun: Tree, args: List[Tree]): Tree = fun.tpe match {
- case OverloadedType(pre, alts) =>
- inferPolyAlternatives(fun, args map (_.tpe))
- val tparams = fun.symbol.typeParams //@M TODO: fun.symbol.info.typeParams ? (as in typedAppliedTypeTree)
- val args1 = if(args.length == tparams.length) {
- //@M: in case TypeApply we can't check the kind-arities of the type arguments,
- // as we don't know which alternative to choose... here we do
- map2Conserve(args, tparams) {
- //@M! the polytype denotes the expected kind
- (arg, tparam) => typedHigherKindedType(arg, mode, polyType(tparam.typeParams, AnyClass.tpe))
- }
- } else // @M: there's probably something wrong when args.length != tparams.length... (triggered by bug #320)
- // Martin, I'm using fake trees, because, if you use args or arg.map(typedType),
- // inferPolyAlternatives loops... -- I have no idea why :-(
- // ...actually this was looping anyway, see bug #278.
- return errorTree(fun, "wrong number of type parameters for "+treeSymTypeMsg(fun))
-
- typedTypeApply(fun, args1)
- case SingleType(_, _) =>
- typedTypeApply(fun setType fun.tpe.widen, args)
- case PolyType(tparams, restpe) if (tparams.length != 0) =>
- if (tparams.length == args.length) {
- val targs = args map (_.tpe)
- checkBounds(tree.pos, NoPrefix, NoSymbol, tparams, targs, "")
- if (fun.symbol == Predef_classOf) {
- checkClassType(args.head, true, false)
- atPos(tree.pos) { gen.mkClassOf(targs.head) }
- } else {
- if (phase.id <= currentRun.typerPhase.id &&
- fun.symbol == Any_isInstanceOf && !targs.isEmpty)
- checkCheckable(tree.pos, targs.head, "")
- val resultpe = restpe.instantiateTypeParams(tparams, targs)
- //@M substitution in instantiateParams needs to be careful!
- //@M example: class Foo[a] { def foo[m[x]]: m[a] = error("") } (new Foo[Int]).foo[List] : List[Int]
- //@M --> first, m[a] gets changed to m[Int], then m gets substituted for List,
- // this must preserve m's type argument, so that we end up with List[Int], and not List[a]
- //@M related bug: #1438
- //println("instantiating type params "+restpe+" "+tparams+" "+targs+" = "+resultpe)
- treeCopy.TypeApply(tree, fun, args) setType resultpe
- }
- } else {
- errorTree(tree, "wrong number of type parameters for "+treeSymTypeMsg(fun))
- }
- case ErrorType =>
- setError(tree)
- case _ =>
- errorTree(tree, treeSymTypeMsg(fun)+" does not take type parameters.")
- }
-
/**
* @param args ...
* @return ...
@@ -3922,7 +3923,7 @@ trait Typers { self: Analyzer =>
}
//@M TODO: context.undetparams = undets_fun ?
- typedTypeApply(fun1, args1)
+ typedTypeApply(tree, mode, fun1, args1)
case Apply(Block(stats, expr), args) =>
typed1(atPos(tree.pos)(Block(stats, Apply(expr, args))), mode, pt)
diff --git a/src/compiler/scala/tools/nsc/util/ClassPath.scala b/src/compiler/scala/tools/nsc/util/ClassPath.scala
index a906aa40c3..a6b0a1244d 100644
--- a/src/compiler/scala/tools/nsc/util/ClassPath.scala
+++ b/src/compiler/scala/tools/nsc/util/ClassPath.scala
@@ -90,7 +90,7 @@ object ClassPath {
}
/**
- * A represents classes which can be loaded with a ClassfileLoader/MSILTypeLoader
+ * Represents classes which can be loaded with a ClassfileLoader/MSILTypeLoader
* and / or a SourcefileLoader.
*/
case class ClassRep[T](binary: Option[T], source: Option[AbstractFile]) {
@@ -124,9 +124,9 @@ abstract class ClassPath[T] {
* The short name of the package (without prefix)
*/
def name: String
- def classes: List[ClassRep[T]]
- def packages: List[ClassPath[T]]
- def sourcepaths: List[AbstractFile]
+ val classes: List[ClassRep[T]]
+ val packages: List[ClassPath[T]]
+ val sourcepaths: List[AbstractFile]
/**
* Find a ClassRep given a class name of the form "package.subpackage.ClassName".
@@ -150,7 +150,7 @@ abstract class ClassPath[T] {
class SourcePath[T](dir: AbstractFile) extends ClassPath[T] {
def name = dir.name
- def classes = {
+ lazy val classes = {
val cls = new ListBuffer[ClassRep[T]]
for (f <- dir.iterator) {
if (!f.isDirectory && ClassPath.validSourceFile(f.name))
@@ -159,7 +159,7 @@ class SourcePath[T](dir: AbstractFile) extends ClassPath[T] {
cls.toList
}
- def packages = {
+ lazy val packages = {
val pkg = new ListBuffer[SourcePath[T]]
for (f <- dir.iterator) {
if (f.isDirectory && ClassPath.validPackage(f.name))
@@ -168,7 +168,7 @@ class SourcePath[T](dir: AbstractFile) extends ClassPath[T] {
pkg.toList
}
- def sourcepaths: List[AbstractFile] = List(dir)
+ val sourcepaths: List[AbstractFile] = List(dir)
override def toString() = "sourcepath: "+ dir.toString()
}
@@ -179,7 +179,7 @@ class SourcePath[T](dir: AbstractFile) extends ClassPath[T] {
class DirectoryClassPath(dir: AbstractFile) extends ClassPath[AbstractFile] {
def name = dir.name
- def classes = {
+ lazy val classes = {
val cls = new ListBuffer[ClassRep[AbstractFile]]
for (f <- dir.iterator) {
if (!f.isDirectory && ClassPath.validClassFile(f.name))
@@ -188,7 +188,7 @@ class DirectoryClassPath(dir: AbstractFile) extends ClassPath[AbstractFile] {
cls.toList
}
- def packages = {
+ lazy val packages = {
val pkg = new ListBuffer[DirectoryClassPath]
for (f <- dir.iterator) {
if (f.isDirectory && ClassPath.validPackage(f.name))
@@ -197,7 +197,7 @@ class DirectoryClassPath(dir: AbstractFile) extends ClassPath[AbstractFile] {
pkg.toList
}
- def sourcepaths: List[AbstractFile] = Nil
+ val sourcepaths: List[AbstractFile] = Nil
override def toString() = "directory classpath: "+ dir.toString()
}
@@ -230,7 +230,7 @@ class AssemblyClassPath(types: Array[MSILType], namespace: String) extends Class
if (types(m).FullName.startsWith(namespace)) m else types.length
}
- def classes = {
+ lazy val classes = {
val cls = new ListBuffer[ClassRep[MSILType]]
var i = first
while (i < types.length && types(i).Namespace.startsWith(namespace)) {
@@ -242,7 +242,7 @@ class AssemblyClassPath(types: Array[MSILType], namespace: String) extends Class
cls.toList
}
- def packages = {
+ lazy val packages = {
val nsSet = new MutHashSet[String]
var i = first
while (i < types.length && types(i).Namespace.startsWith(namespace)) {
@@ -260,7 +260,7 @@ class AssemblyClassPath(types: Array[MSILType], namespace: String) extends Class
yield new AssemblyClassPath(types, ns)
}
- def sourcepaths: List[AbstractFile] = Nil
+ val sourcepaths: List[AbstractFile] = Nil
override def toString() = "assembly classpath "+ namespace
}
@@ -273,7 +273,7 @@ abstract class MergedClassPath[T] extends ClassPath[T] {
def name = entries.head.name
- def classes: List[ClassRep[T]] = {
+ lazy val classes: List[ClassRep[T]] = {
val cls = new ListBuffer[ClassRep[T]]
for (e <- entries; c <- e.classes) {
val name = c.name
@@ -291,7 +291,7 @@ abstract class MergedClassPath[T] extends ClassPath[T] {
cls.toList
}
- def packages: List[ClassPath[T]] = {
+ lazy val packages: List[ClassPath[T]] = {
val pkg = new ListBuffer[ClassPath[T]]
for (e <- entries; p <- e.packages) {
val name = p.name
@@ -305,7 +305,7 @@ abstract class MergedClassPath[T] extends ClassPath[T] {
pkg.toList
}
- def sourcepaths: List[AbstractFile] = entries.flatMap(_.sourcepaths)
+ lazy val sourcepaths: List[AbstractFile] = entries.flatMap(_.sourcepaths)
private def addPackage(to: ClassPath[T], pkg: ClassPath[T]) = to match {
case cp: MergedClassPath[_] =>
diff --git a/src/compiler/scala/tools/nsc/util/Statistics.scala b/src/compiler/scala/tools/nsc/util/Statistics.scala
index eb8a25a6ed..188f2fcdb2 100644
--- a/src/compiler/scala/tools/nsc/util/Statistics.scala
+++ b/src/compiler/scala/tools/nsc/util/Statistics.scala
@@ -10,35 +10,53 @@ package util
object Statistics {
- var enabled = false
+ private var _enabled = false
+
+ def enabled = _enabled
+ def enabled_=(cond: Boolean) = {
+ if (cond && !_enabled) {
+ val test = new Timer()
+ val start = System.nanoTime()
+ var total = 0L
+ for (i <- 1 to 10000) {
+ val time = System.nanoTime()
+ total += System.nanoTime() - time
+ }
+ val total2 = System.nanoTime() - start
+ println("Enabling statistics, measuring overhead = "+
+ total/10000.0+"ns to "+total2/10000.0+"ns per timer")
+ _enabled = true
+ }
+ }
+
var phasesShown = List("parser", "typer", "erasure", "cleanup")
def currentTime() =
- if (enabled) System.nanoTime() else 0L
+ if (_enabled) System.nanoTime() else 0L
private def showPercent(x: Double, base: Double) =
if (base == 0) "" else " ("+"%2.1f".format(x / base * 100)+"%)"
def incCounter(c: Counter) {
- if (enabled) c.value += 1
+ if (_enabled) c.value += 1
}
def incCounter(c: Counter, delta: Int) {
- if (enabled) c.value += delta
+ if (_enabled) c.value += delta
}
def startCounter(sc: SubCounter): IntPair =
- if (enabled) sc.start() else null
+ if (_enabled) sc.start() else null
def stopCounter(sc: SubCounter, start: IntPair) {
- if (enabled) sc.stop(start)
+ if (_enabled) sc.stop(start)
}
def startTimer(tm: Timer): LongPair =
- if (enabled) tm.start() else null
+ if (_enabled) tm.start() else null
def stopTimer(tm: Timer, start: LongPair) {
- if (enabled) tm.stop(start)
+ if (_enabled) tm.stop(start)
}
case class IntPair(x: Int, y: Int)
@@ -52,9 +70,9 @@ object Statistics {
class SubCounter(c: Counter) {
var value: Int = 0
def start(): IntPair =
- if (enabled) IntPair(value, c.value) else null
+ if (_enabled) IntPair(value, c.value) else null
def stop(prev: IntPair) {
- if (enabled) {
+ if (_enabled) {
val IntPair(value0, cvalue0) = prev
value = value0 + c.value - cvalue0
}
@@ -64,16 +82,21 @@ object Statistics {
}
class Timer {
- var nanos: Long = 0L
+ var nanos: Long = 0
+ var timings = 0
def start(): LongPair =
- if (enabled) LongPair(nanos, System.nanoTime()) else null
+ if (_enabled) {
+ timings += 1
+ LongPair(nanos, System.nanoTime())
+ } else null
def stop(prev: LongPair) {
- if (enabled) {
+ if (_enabled) {
val LongPair(nanos0, start) = prev
nanos = nanos0 + System.nanoTime() - start
+ timings += 1
}
}
- override def toString = nanos.toString+"ns"
+ override def toString = (timings/2)+" spans, "+nanos.toString+"ns"
}
class ClassCounts extends scala.collection.mutable.HashMap[Class[_], Int] {
@@ -82,6 +105,11 @@ object Statistics {
var nodeByType = new ClassCounts
+ var microsByType = new ClassCounts
+ var visitsByType = new ClassCounts
+ var pendingTreeTypes: List[Class[_]] = List()
+ var typerTime: Long = 0L
+
val singletonBaseTypeSeqCount = new Counter
val compoundBaseTypeSeqCount = new Counter
val typerefBaseTypeSeqCount = new Counter
@@ -137,8 +165,11 @@ object Statistics {
val subtypeImprovCount = new SubCounter(subtypeCount)
val subtypeETNanos = new Timer
val matchesPtNanos = new Timer
- val counter1: SubCounter = new SubCounter(findMemberCount)
- val counter2: SubCounter = new SubCounter(findMemberCount)
+ val ctr1 = new Counter
+ val ctr2 = new Counter
+ val ctr3 = new Counter
+ val counter1: SubCounter = new SubCounter(subtypeCount)
+ val counter2: SubCounter = new SubCounter(subtypeCount)
val timer1: Timer = new Timer
val timer2: Timer = new Timer
}
@@ -159,7 +190,7 @@ abstract class Statistics {
value+showPercent(value, base)
def showRelTyper(timer: Timer) =
- timer.nanos+"ns"+showPercent(timer.nanos, typerNanos.nanos)
+ timer+showPercent(timer.nanos, typerNanos.nanos)
def showCounts(counts: ClassCounts) =
counts.toSeq.sortWith(_._2 > _._2).map {
@@ -219,6 +250,11 @@ abstract class Statistics {
inform("time spent in failed : "+showRelTyper(failedSilentNanos))
inform(" failed apply : "+showRelTyper(failedApplyNanos))
inform(" failed op= : "+showRelTyper(failedOpEqNanos))
+ inform("micros by tree node : "+showCounts(microsByType))
+ inform("#visits by tree node : "+showCounts(visitsByType))
+ val average = new ClassCounts
+ for (c <- microsByType.keysIterator) average(c) = microsByType(c)/visitsByType(c)
+ inform("avg micros by tree node : "+showCounts(average))
inform("time spent in <:< : "+showRelTyper(subtypeNanos))
inform("time spent in findmember : "+showRelTyper(findMemberNanos))
inform("time spent in asSeenFrom : "+showRelTyper(asSeenFromNanos))
@@ -229,6 +265,9 @@ abstract class Statistics {
inform("#implicit oftype hits : " + oftypeImplicitHits)
}
+ if (ctr1 != null) inform("#ctr1 : " + ctr1)
+ if (ctr2 != null) inform("#ctr2 : " + ctr2)
+ if (ctr3 != null) inform("#ctr3 : " + ctr3)
if (counter1 != null) inform("#counter1 : " + counter1)
if (counter2 != null) inform("#counter2 : " + counter2)
if (timer1 != null) inform("#timer1 : " + timer1)
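The timer fields above are driven by start/stop pairs such as the startTimer(timer1)/stopTimer(timer1, start) calls added to isSameType earlier in this patch; the usage pattern, sketched here for reference (compiler-internal code, not a public API):

// mirroring the isSameType instrumentation above
val start = startTimer(timer1)
try {
  // ... timed work ...
} finally stopTimer(timer1, start)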
diff --git a/src/library/scala/Option.scala b/src/library/scala/Option.scala
index 6cb0fec929..8511fa78a5 100644
--- a/src/library/scala/Option.scala
+++ b/src/library/scala/Option.scala
@@ -110,7 +110,7 @@ sealed abstract class Option[+A] extends Product {
*
* @param pf the partial function.
*/
- def partialMap[B](pf: A =>? B): Option[B] =
+ def partialMap[B](pf: PartialFunction[A, B]): Option[B] =
if (!isEmpty && pf.isDefinedAt(this.get)) Some(pf(this.get)) else None
/** If the option is nonempty return it,
diff --git a/src/library/scala/PartialFunction.scala b/src/library/scala/PartialFunction.scala
index f62fa68565..f450596e57 100644
--- a/src/library/scala/PartialFunction.scala
+++ b/src/library/scala/PartialFunction.scala
@@ -38,7 +38,7 @@ trait PartialFunction[-A, +B] extends (A => B) {
* of this partial function and `that`. The resulting partial function
* takes `x` to `this(x)` where `this` is defined, and to `that(x)` where it is not.
*/
- def orElse[A1 <: A, B1 >: B](that: A1 =>? B1) : A1 =>? B1 =
+ def orElse[A1 <: A, B1 >: B](that: PartialFunction[A1, B1]) : PartialFunction[A1, B1] =
new PartialFunction[A1, B1] {
def isDefinedAt(x: A1): Boolean =
PartialFunction.this.isDefinedAt(x) || that.isDefinedAt(x)
@@ -54,7 +54,7 @@ trait PartialFunction[-A, +B] extends (A => B) {
* @return a partial function with the same domain as this partial function, which maps
* arguments `x` to `k(this(x))`.
*/
- override def andThen[C](k: B => C): A =>? C = new PartialFunction[A, C] {
+ override def andThen[C](k: B => C) : PartialFunction[A, C] = new PartialFunction[A, C] {
def isDefinedAt(x: A): Boolean = PartialFunction.this.isDefinedAt(x)
def apply(x: A): C = k(PartialFunction.this.apply(x))
}
@@ -92,7 +92,7 @@ object PartialFunction
* @param pf the partial function
* @return true, iff `x` is in the domain of `pf` and `pf(x) == true`.
*/
- def cond[T](x: T)(pf: T =>? Boolean): Boolean =
+ def cond[T](x: T)(pf: PartialFunction[T, Boolean]): Boolean =
(pf isDefinedAt x) && pf(x)
/** Transforms a PartialFunction[T, U] `pf' into Function1[T, Option[U]] `f'
@@ -104,6 +104,6 @@ object PartialFunction
* @param pf the PartialFunction[T, U]
* @return `Some(pf(x))` if `pf isDefinedAt x`, `None` otherwise.
*/
- def condOpt[T,U](x: T)(pf: T =>? U): Option[U] =
+ def condOpt[T,U](x: T)(pf: PartialFunction[T, U]): Option[U] =
if (pf isDefinedAt x) Some(pf(x)) else None
}
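An illustrative use of the de-sugared signatures (hypothetical demo object): with `=>?` deprecated, the PartialFunction type is written out, but call sites of cond and condOpt are unchanged.

import PartialFunction.{ cond, condOpt }

object PFDemo {
  val isEven: PartialFunction[Int, Boolean] = { case n if n % 2 == 0 => true }
  val a = cond(4)(isEven)                                       // true
  val b = condOpt("scala") { case s if s.nonEmpty => s.length } // Some(5)
}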
diff --git a/src/library/scala/Predef.scala b/src/library/scala/Predef.scala
index 1b9f4b75ba..5684c91aaa 100644
--- a/src/library/scala/Predef.scala
+++ b/src/library/scala/Predef.scala
@@ -117,7 +117,7 @@ object Predef extends LowPriorityImplicits {
throw new IllegalArgumentException("requirement failed: "+ message)
}
- class Ensuring[A](x: A) {
+ final class Ensuring[A](val x: A) {
def ensuring(cond: Boolean): A = { assert(cond); x }
def ensuring(cond: Boolean, msg: Any): A = { assert(cond, msg); x }
def ensuring(cond: A => Boolean): A = { assert(cond(x)); x }
@@ -139,7 +139,7 @@ object Predef extends LowPriorityImplicits {
def unapply[A, B, C](x: Tuple3[A, B, C]): Option[Tuple3[A, B, C]] = Some(x)
}
- class ArrowAssoc[A](x: A) {
+ final class ArrowAssoc[A](val x: A) {
@inline def -> [B](y: B): Tuple2[A, B] = Tuple2(x, y)
def →[B](y: B): Tuple2[A, B] = ->(y)
}
diff --git a/src/library/scala/collection/IndexedSeqViewLike.scala b/src/library/scala/collection/IndexedSeqViewLike.scala
index 06fa6c8953..07f63ad2b0 100644
--- a/src/library/scala/collection/IndexedSeqViewLike.scala
+++ b/src/library/scala/collection/IndexedSeqViewLike.scala
@@ -24,28 +24,55 @@ import TraversableView.NoBuilder
trait IndexedSeqViewLike[+A,
+Coll,
+This <: IndexedSeqView[A, Coll] with IndexedSeqViewLike[A, Coll, This]]
- extends IndexedSeq[A]
- with IndexedSeqLike[A, This]
- with SeqView[A, Coll]
- with SeqViewLike[A, Coll, This]
- with views.IndexedSeqTransformations[A, Coll, This]
+ extends IndexedSeq[A] with IndexedSeqLike[A, This] with SeqView[A, Coll] with SeqViewLike[A, Coll, This]
{ self =>
- trait Transformed[+B] extends views.IndexedSeqLike[B, Coll] with super.Transformed[B]
+ trait Transformed[+B] extends IndexedSeqView[B, Coll] with super.Transformed[B]
- trait Sliced extends Transformed[A] with super.Sliced
- trait Mapped[B] extends Transformed[B] with super.Mapped[B]
- trait FlatMapped[B] extends Transformed[B] with super.FlatMapped[B]
- trait Appended[B >: A] extends Transformed[B] with super.Appended[B]
- trait Filtered extends Transformed[A] with super.Filtered
- trait TakenWhile extends Transformed[A] with super.TakenWhile
- trait DroppedWhile extends Transformed[A] with super.DroppedWhile
- trait Reversed extends Transformed[A] with super.Reversed
- trait Patched[B >: A] extends Transformed[B] with super.Patched[B]
+ trait Sliced extends Transformed[A] with super.Sliced {
+ /** Override to use IndexedSeq's foreach; todo: see whether this is really faster */
+ override def foreach[U](f: A => U) = super[Transformed].foreach(f)
+ }
+
+ trait Mapped[B] extends Transformed[B] with super.Mapped[B] {
+ override def foreach[U](f: B => U) = super[Transformed].foreach(f)
+ }
+
+ trait FlatMapped[B] extends Transformed[B] with super.FlatMapped[B] {
+ override def foreach[U](f: B => U) = super[Transformed].foreach(f)
+ }
+
+ trait Appended[B >: A] extends Transformed[B] with super.Appended[B] {
+ override def foreach[U](f: B => U) = super[Transformed].foreach(f)
+ }
+
+ trait Filtered extends Transformed[A] with super.Filtered {
+ override def foreach[U](f: A => U) = super[Transformed].foreach(f)
+ }
+
+ trait TakenWhile extends Transformed[A] with super.TakenWhile {
+ override def foreach[U](f: A => U) = super[Transformed].foreach(f)
+ }
+
+ trait DroppedWhile extends Transformed[A] with super.DroppedWhile {
+ override def foreach[U](f: A => U) = super[Transformed].foreach(f)
+ }
+
+ trait Reversed extends Transformed[A] with super.Reversed {
+ override def foreach[U](f: A => U) = super[Transformed].foreach(f)
+ }
+
+ trait Patched[B >: A] extends Transformed[B] with super.Patched[B] {
+ override def foreach[U](f: B => U) = super[Transformed].foreach(f)
+ }
trait Zipped[B] extends Transformed[(A, B)] {
protected[this] val other: Iterable[B]
- def length = self.length min other.size
+ /** Have to be careful here - other may be an infinite sequence. */
+ def length =
+ if (other.hasDefiniteSize) self.length min other.size
+ else other take self.length size
+
def apply(idx: Int): (A, B) = (self.apply(idx), other.iterator drop idx next)
override def stringPrefix = self.stringPrefix+"Z"
}
@@ -65,5 +92,22 @@ trait IndexedSeqViewLike[+A,
}
override def stringPrefix = self.stringPrefix+"Z"
}
+
+ /** Boilerplate method, to override in each subclass
+ * This method could be eliminated if Scala had virtual classes
+ */
+ protected override def newAppended[B >: A](that: Traversable[B]): Transformed[B] = new Appended[B] { val rest = that }
+ protected override def newMapped[B](f: A => B): Transformed[B] = new Mapped[B] { val mapping = f }
+ protected override def newFlatMapped[B](f: A => Traversable[B]): Transformed[B] = new FlatMapped[B] { val mapping = f }
+ protected override def newFiltered(p: A => Boolean): Transformed[A] = new Filtered { val pred = p }
+ protected override def newSliced(_from: Int, _until: Int): Transformed[A] = new Sliced { val from = _from; val until = _until }
+ protected override def newDroppedWhile(p: A => Boolean): Transformed[A] = new DroppedWhile { val pred = p }
+ protected override def newTakenWhile(p: A => Boolean): Transformed[A] = new TakenWhile { val pred = p }
+ protected override def newZipped[B](that: Iterable[B]): Transformed[(A, B)] = new Zipped[B] { val other = that }
+ protected override def newZippedAll[A1 >: A, B](that: Iterable[B], _thisElem: A1, _thatElem: B): Transformed[(A1, B)] = new ZippedAll[A1, B] { val other = that; val thisElem = _thisElem; val thatElem = _thatElem }
+ protected override def newReversed: Transformed[A] = new Reversed { }
+ protected override def newPatched[B >: A](_from: Int, _patch: Seq[B], _replaced: Int): Transformed[B] = new Patched[B] {
+ val from = _from; val patch = _patch; val replaced = _replaced
+ }
override def stringPrefix = "IndexedSeqView"
}
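A sketch (hypothetical demo object) of the case the revised Zipped length guards against: zipping an indexed view with a collection of no definite size, such as an infinite Stream.

object ZipViewDemo {
  val zipped = (1 to 3).view zip Stream.from(100)
  // with the take-based definition above, length is 3 and the Stream is only
  // traversed as far as needed; the old `self.length min other.size` would not terminate
  val n = zipped.length
}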
diff --git a/src/library/scala/collection/IterableViewLike.scala b/src/library/scala/collection/IterableViewLike.scala
index 831a244352..27323294c4 100644
--- a/src/library/scala/collection/IterableViewLike.scala
+++ b/src/library/scala/collection/IterableViewLike.scala
@@ -24,14 +24,10 @@ import TraversableView.NoBuilder
trait IterableViewLike[+A,
+Coll,
+This <: IterableView[A, Coll] with IterableViewLike[A, Coll, This]]
- extends Iterable[A]
- with IterableLike[A, This]
- with TraversableView[A, Coll]
- with TraversableViewLike[A, Coll, This]
- with views.IterableTransformations[A, Coll, This]
+extends Iterable[A] with IterableLike[A, This] with TraversableView[A, Coll] with TraversableViewLike[A, Coll, This]
{ self =>
- trait Transformed[+B] extends views.IterableLike[B, Coll] with super.Transformed[B]
+ trait Transformed[+B] extends IterableView[B, Coll] with super.Transformed[B]
trait Sliced extends Transformed[A] with super.Sliced {
override def iterator = self.iterator slice (from, until)
@@ -88,5 +84,25 @@ trait IterableViewLike[+A,
override def zipAll[B, A1 >: A, That](that: Iterable[B], thisElem: A1, thatElem: B)(implicit bf: CanBuildFrom[This, (A1, B), That]): That =
newZippedAll(that, thisElem, thatElem).asInstanceOf[That]
+ protected def newZipped[B](that: Iterable[B]): Transformed[(A, B)] = new Zipped[B] {
+ val other = that
+ }
+ protected def newZippedAll[A1 >: A, B](that: Iterable[B], _thisElem: A1, _thatElem: B): Transformed[(A1, B)] = new ZippedAll[A1, B] {
+ val other: Iterable[B] = that
+ val thisElem = _thisElem
+ val thatElem = _thatElem
+ }
+
+ /** Boilerplate method, to override in each subclass
+ * This method could be eliminated if Scala had virtual classes
+ */
+ protected override def newAppended[B >: A](that: Traversable[B]): Transformed[B] = new Appended[B] { val rest = that }
+ protected override def newMapped[B](f: A => B): Transformed[B] = new Mapped[B] { val mapping = f }
+ protected override def newFlatMapped[B](f: A => Traversable[B]): Transformed[B] = new FlatMapped[B] { val mapping = f }
+ protected override def newFiltered(p: A => Boolean): Transformed[A] = new Filtered { val pred = p }
+ protected override def newSliced(_from: Int, _until: Int): Transformed[A] = new Sliced { val from = _from; val until = _until }
+ protected override def newDroppedWhile(p: A => Boolean): Transformed[A] = new DroppedWhile { val pred = p }
+ protected override def newTakenWhile(p: A => Boolean): Transformed[A] = new TakenWhile { val pred = p }
+
override def stringPrefix = "IterableView"
}
diff --git a/src/library/scala/collection/Iterator.scala b/src/library/scala/collection/Iterator.scala
index 5f28161ed3..c23765c9bc 100644
--- a/src/library/scala/collection/Iterator.scala
+++ b/src/library/scala/collection/Iterator.scala
@@ -411,7 +411,7 @@ trait Iterator[+A] { self =>
* @return a new iterator which yields each value `x` produced by this iterator for
* which `pf` is defined the image `pf(x)`.
*/
- def partialMap[B](pf: A =>? B): Iterator[B] = {
+ def partialMap[B](pf: PartialFunction[A, B]): Iterator[B] = {
val self = buffered
new Iterator[B] {
private def skip() = while (self.hasNext && !pf.isDefinedAt(self.head)) self.next()
@@ -1112,9 +1112,10 @@ trait Iterator[+A] { self =>
res.toList
}
- /** Traverses this iterator and returns all produced values in a list.
+ /** Lazily wraps a Stream around this iterator so its values are memoized.
*
- * @return a stream which contains all values produced by this iterator.
+ * @return a Stream which can repeatedly produce all the values
+ * produced by this iterator.
*/
def toStream: Stream[A] =
if (hasNext) Stream.cons(next, toStream) else Stream.empty
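A small sketch (hypothetical demo object) of the memoizing behaviour the revised toStream doc describes:

object ToStreamDemo {
  val it = Iterator(1, 2, 3)
  val s  = it.toStream
  // the Stream caches the iterator's values, so it can be traversed more than once
  val sums = (s.sum, s.sum)   // (6, 6)
}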
diff --git a/src/library/scala/collection/JavaConversions.scala b/src/library/scala/collection/JavaConversions.scala
index 15005f71ba..7af138067b 100644
--- a/src/library/scala/collection/JavaConversions.scala
+++ b/src/library/scala/collection/JavaConversions.scala
@@ -53,6 +53,7 @@ package scala.collection
*/
object JavaConversions {
import java.{ lang => jl, util => ju }
+ import java.util.{ concurrent => juc }
import scala.collection.{ generic, immutable, mutable, Traversable }
import scala.reflect.ClassManifest
@@ -178,10 +179,17 @@ object JavaConversions {
* @return A Java <code>Map</code> view of the argument.
*/
implicit def asMap[A, B](m : mutable.Map[A, B])(implicit ma : ClassManifest[A]) : ju.Map[A, B] = m match {
+ //case JConcurrentMapWrapper(wrapped) => wrapped
case JMapWrapper(wrapped) => wrapped
case _ => new MutableMapWrapper(m)(ma)
}
+ implicit def asConcurrentMap[A, B](m: mutable.ConcurrentMap[A, B])
+ (implicit ma: ClassManifest[A], mb: ClassManifest[B]): juc.ConcurrentMap[A, B] = m match {
+ case JConcurrentMapWrapper(wrapped) => wrapped
+ case _ => new ConcurrentMapWrapper(m)(ma, mb)
+ }
+
// Java => Scala
/**
@@ -303,8 +311,31 @@ object JavaConversions {
* @return A Scala mutable <code>Map</code> view of the argument.
*/
implicit def asMap[A, B](m : ju.Map[A, B]) = m match {
+ //case ConcurrentMapWrapper(wrapped) => wrapped
case MutableMapWrapper(wrapped) => wrapped
- case _ =>new JMapWrapper(m)
+ case _ => new JMapWrapper(m)
+ }
+
+ /**
+ * Implicitly converts a Java <code>ConcurrentMap</code> to a Scala mutable <code>ConcurrentMap</code>.
+ * The returned Scala <code>ConcurrentMap</code> is backed by the provided Java
+ * <code>ConcurrentMap</code> and any side-effects of using it via the Scala interface will
+ * be visible via the Java interface and vice versa.
+ * <p>
+ * If the Java <code>ConcurrentMap</code> was previously obtained from an implicit or
+ * explicit call of <code>asConcurrentMap(scala.collection.mutable.ConcurrentMap)</code> then the original
+ * Scala <code>ConcurrentMap</code> will be returned.
+ *
+ * @param m The <code>ConcurrentMap</code> to be converted.
+ * @return A Scala mutable <code>ConcurrentMap</code> view of the argument.
+ */
+ implicit def asConcurrentMap[A, B](m: juc.ConcurrentMap[A, B]) = m match {
+ case ConcurrentMapWrapper(wrapped) => wrapped
+ case _ => new JConcurrentMapWrapper(m)
+ }
+
+ implicit def asMap(p: ju.Properties): mutable.Map[String, String] = p match {
+ case _ => new JPropertiesWrapper(p)
}
// Private implementations ...
@@ -410,7 +441,8 @@ object JavaConversions {
override def empty = JSetWrapper(new ju.HashSet[A])
}
- case class MutableMapWrapper[A, B](underlying : mutable.Map[A, B])(m : ClassManifest[A]) extends ju.AbstractMap[A, B] {
+ abstract class MutableMapWrapperLike[A, B](underlying: mutable.Map[A, B])(m: ClassManifest[A])
+ extends ju.AbstractMap[A, B] {
self =>
override def size = underlying.size
@@ -462,7 +494,12 @@ object JavaConversions {
}
}
- case class JMapWrapper[A, B](underlying : ju.Map[A, B]) extends mutable.Map[A, B] with mutable.MapLike[A, B, JMapWrapper[A, B]] {
+ case class MutableMapWrapper[A, B](underlying : mutable.Map[A, B])(m : ClassManifest[A])
+ extends MutableMapWrapperLike[A, B](underlying)(m)
+
+ abstract class JMapWrapperLike[A, B, +Repr <: mutable.MapLike[A, B, Repr] with mutable.Map[A, B]]
+ (underlying: ju.Map[A, B])
+ extends mutable.Map[A, B] with mutable.MapLike[A, B, Repr] {
override def size = underlying.size
def get(k : A) = {
@@ -498,6 +535,127 @@ object JavaConversions {
override def clear = underlying.clear
+ override def empty: Repr = null.asInstanceOf[Repr]
+ }
+
+ case class JMapWrapper[A, B](underlying : ju.Map[A, B])
+ extends JMapWrapperLike[A, B, JMapWrapper[A, B]](underlying) {
override def empty = JMapWrapper(new ju.HashMap[A, B])
}
+
+ case class ConcurrentMapWrapper[A, B](underlying: mutable.ConcurrentMap[A, B])
+ (m: ClassManifest[A], mv: ClassManifest[B])
+ extends MutableMapWrapperLike[A, B](underlying)(m) with juc.ConcurrentMap[A, B] {
+ self =>
+
+ override def remove(k : AnyRef) = {
+ if (!m.erasure.isInstance(k))
+ null.asInstanceOf[B]
+ else {
+ val k1 = k.asInstanceOf[A]
+ underlying.remove(k1) match {
+ case Some(v) => v
+ case None => null.asInstanceOf[B]
+ }
+ }
+ }
+
+ def putIfAbsent(k: A, v: B) = underlying.putIfAbsent(k, v) match {
+ case Some(v) => v
+ case None => null.asInstanceOf[B]
+ }
+
+ def remove(k: AnyRef, v: AnyRef) = {
+ if (!m.erasure.isInstance(k) || !mv.erasure.isInstance(v))
+ false
+ else {
+ val k1 = k.asInstanceOf[A]
+ val v1 = v.asInstanceOf[B]
+ underlying.remove(k1, v1)
+ }
+ }
+
+ def replace(k: A, v: B): B = underlying.replace(k, v) match {
+ case Some(v) => v
+ case None => null.asInstanceOf[B]
+ }
+
+ def replace(k: A, oldval: B, newval: B) = underlying.replace(k, oldval, newval)
+
+ }
+
+ case class JConcurrentMapWrapper[A, B](underlying: juc.ConcurrentMap[A, B])
+ extends JMapWrapperLike[A, B, JConcurrentMapWrapper[A, B]](underlying) with mutable.ConcurrentMap[A, B] {
+ override def get(k: A) = {
+ val v = underlying.get(k)
+ if (v != null) Some(v)
+ else None
+ }
+
+ override def empty = new JConcurrentMapWrapper(new juc.ConcurrentHashMap[A, B])
+
+ def putIfAbsent(k: A, v: B): Option[B] = {
+ val r = underlying.putIfAbsent(k, v)
+ if (r != null) Some(r) else None
+ }
+
+ def remove(k: A, v: B): Boolean = underlying.remove(k, v)
+
+ def replace(k: A, v: B): Option[B] = {
+ val prev = underlying.replace(k, v)
+ if (prev != null) Some(prev) else None
+ }
+
+ def replace(k: A, oldvalue: B, newvalue: B): Boolean = underlying.replace(k, oldvalue, newvalue)
+
+ }
+
+ case class JPropertiesWrapper(underlying: ju.Properties)
+ extends mutable.Map[String, String] with mutable.MapLike[String, String, JPropertiesWrapper] {
+ override def size = underlying.size
+
+ def get(k : String) = {
+ val v = underlying.get(k)
+ if (v != null)
+ Some(v.asInstanceOf[String])
+ else
+ None
+ }
+
+ def +=(kv: (String, String)): this.type = { underlying.put(kv._1, kv._2); this }
+ def -=(key: String): this.type = { underlying.remove(key); this }
+
+ override def put(k : String, v : String): Option[String] = {
+ val r = underlying.put(k, v)
+ if (r != null) Some(r.asInstanceOf[String]) else None
+ }
+
+ override def update(k : String, v : String) { underlying.put(k, v) }
+
+ override def remove(k : String): Option[String] = {
+ val r = underlying.remove(k)
+ if (r != null) Some(r.asInstanceOf[String]) else None
+ }
+
+ def iterator = new Iterator[(String, String)] {
+ val ui = underlying.entrySet.iterator
+ def hasNext = ui.hasNext
+ def next = { val e = ui.next ; (e.getKey.asInstanceOf[String], e.getValue.asInstanceOf[String]) }
+ }
+
+ override def clear = underlying.clear
+
+ override def empty = JPropertiesWrapper(new ju.Properties)
+
+ def getProperty(key: String) = underlying.getProperty(key)
+
+ def getProperty(key: String, defaultValue: String) = underlying.getProperty(key, defaultValue)
+
+ def setProperty(key: String, value: String) = underlying.setProperty(key, value)
+ }
+
}
+
+
+
+
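As a usage sketch for the JavaConversions additions above (the key and value strings here are illustrative; the same behaviour is exercised by test/files/run/map_java_conversions.scala later in this patch):

    import scala.collection.JavaConversions.asMap

    val props = new java.util.Properties
    val m: collection.mutable.Map[String, String] = asMap(props)
    m("scala.home") = "/opt/scala"                    // JPropertiesWrapper delegates to Properties.put
    assert(props.getProperty("scala.home") == "/opt/scala")
    assert(m.remove("scala.home") == Some("/opt/scala"))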
diff --git a/src/library/scala/collection/SeqViewLike.scala b/src/library/scala/collection/SeqViewLike.scala
index 6f677616e7..1a8cd20013 100644
--- a/src/library/scala/collection/SeqViewLike.scala
+++ b/src/library/scala/collection/SeqViewLike.scala
@@ -23,14 +23,13 @@ import TraversableView.NoBuilder
trait SeqViewLike[+A,
+Coll,
+This <: SeqView[A, Coll] with SeqViewLike[A, Coll, This]]
- extends Seq[A]
- with SeqLike[A, This]
- with IterableView[A, Coll]
- with IterableViewLike[A, Coll, This]
- with views.SeqTransformations[A, Coll, This]
+ extends Seq[A] with SeqLike[A, This] with IterableView[A, Coll] with IterableViewLike[A, Coll, This]
{ self =>
- trait Transformed[+B] extends views.SeqLike[B, Coll] with super.Transformed[B]
+ trait Transformed[+B] extends SeqView[B, Coll] with super.Transformed[B] {
+ override def length: Int
+ override def apply(idx: Int): B
+ }
trait Sliced extends Transformed[A] with super.Sliced {
override def length = ((until min self.length) - from) max 0
@@ -144,6 +143,21 @@ trait SeqViewLike[+A,
override def stringPrefix = self.stringPrefix+"P"
}
+ /** Boilerplate methods, to override in each subclass.
+ * These methods could be eliminated if Scala had virtual classes.
+ */
+ protected override def newAppended[B >: A](that: Traversable[B]): Transformed[B] = new Appended[B] { val rest = that }
+ protected override def newMapped[B](f: A => B): Transformed[B] = new Mapped[B] { val mapping = f }
+ protected override def newFlatMapped[B](f: A => Traversable[B]): Transformed[B] = new FlatMapped[B] { val mapping = f }
+ protected override def newFiltered(p: A => Boolean): Transformed[A] = new Filtered { val pred = p }
+ protected override def newSliced(_from: Int, _until: Int): Transformed[A] = new Sliced { val from = _from; val until = _until }
+ protected override def newDroppedWhile(p: A => Boolean): Transformed[A] = new DroppedWhile { val pred = p }
+ protected override def newTakenWhile(p: A => Boolean): Transformed[A] = new TakenWhile { val pred = p }
+ protected override def newZipped[B](that: Iterable[B]): Transformed[(A, B)] = new Zipped[B] { val other = that }
+ protected override def newZippedAll[A1 >: A, B](that: Iterable[B], _thisElem: A1, _thatElem: B): Transformed[(A1, B)] = new ZippedAll[A1, B] { val other = that; val thisElem = _thisElem; val thatElem = _thatElem }
+ protected def newReversed: Transformed[A] = new Reversed { }
+ protected def newPatched[B >: A](_from: Int, _patch: Seq[B], _replaced: Int): Transformed[B] = new Patched[B] { val from = _from; val patch = _patch; val replaced = _replaced }
+
override def reverse: This = newReversed.asInstanceOf[This]
override def patch[B >: A, That](from: Int, patch: Seq[B], replaced: Int)(implicit bf: CanBuildFrom[This, B, That]): That = {
diff --git a/src/library/scala/collection/TraversableLike.scala b/src/library/scala/collection/TraversableLike.scala
index f570c4ca9a..fc666ddb92 100644
--- a/src/library/scala/collection/TraversableLike.scala
+++ b/src/library/scala/collection/TraversableLike.scala
@@ -292,13 +292,13 @@ self =>
* `pf` to each element on which it is defined and collecting the results.
* The order of the elements is preserved.
*
- * @usecase def partialMap[B](pf: A =>? B): $Coll[B]
+ * @usecase def partialMap[B](pf: PartialFunction[A, B]): $Coll[B]
*
* @return a new $coll resulting from applying the given partial function
* `pf` to each element on which it is defined and collecting the results.
* The order of the elements is preserved.
*/
- def partialMap[B, That](pf: A =>? B)(implicit bf: CanBuildFrom[Repr, B, That]): That = {
+ def partialMap[B, That](pf: PartialFunction[A, B])(implicit bf: CanBuildFrom[Repr, B, That]): That = {
val b = bf(repr)
for (x <- this) if (pf.isDefinedAt(x)) b += pf(x)
b.result
@@ -984,9 +984,7 @@ self =>
def toList: List[A] = (new ListBuffer[A] ++= thisCollection).toList
/** Converts this $coll to an iterable collection.
- *
- * Note: Will not terminate for infinite-sized collections.
- *
+ * $willNotTerminateInf
* @return an `Iterable` containing all elements of this $coll.
*/
def toIterable: Iterable[A] = toStream
@@ -1015,6 +1013,22 @@ self =>
*/
def toSet[B >: A]: immutable.Set[B] = immutable.Set() ++ thisCollection
+ /** Converts this $coll to a map. This method is unavailable unless
+ * the elements are pairs (instances of Tuple2); each (K, V) pair becomes
+ * a key-value entry in the map. Duplicate keys are overwritten by later
+ * keys: if this is an unordered collection, which key ends up in the
+ * resulting map is undefined.
+ * $willNotTerminateInf
+ * @return a map containing all elements of this $coll.
+ */
+ def toMap[T, U](implicit ev: A <:< (T, U)): immutable.Map[T, U] = {
+ val b = immutable.Map.newBuilder[T, U]
+ for (x <- this)
+ b += x
+
+ b.result
+ }
+
/** Displays all elements of this $coll in a string using start, end, and separator strings.
*
* @param start the starting string.
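A quick illustration of the new toMap (the values are illustrative, not part of the patch):

    val pairs = List("a" -> 1, "b" -> 2, "a" -> 3)
    assert(pairs.toMap == Map("a" -> 3, "b" -> 2))    // later keys overwrite earlier ones
    // List(1, 2, 3).toMap does not compile: the A <:< (T, U) evidence is missing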
diff --git a/src/library/scala/collection/TraversableProxyLike.scala b/src/library/scala/collection/TraversableProxyLike.scala
index d8e3ed2a1b..24d6c7048d 100644
--- a/src/library/scala/collection/TraversableProxyLike.scala
+++ b/src/library/scala/collection/TraversableProxyLike.scala
@@ -36,7 +36,7 @@ trait TraversableProxyLike[+A, +This <: TraversableLike[A, This] with Traversabl
override def ++[B >: A, That](that: Iterator[B])(implicit bf: CanBuildFrom[This, B, That]): That = self.++(that)(bf)
override def map[B, That](f: A => B)(implicit bf: CanBuildFrom[This, B, That]): That = self.map(f)(bf)
override def flatMap[B, That](f: A => Traversable[B])(implicit bf: CanBuildFrom[This, B, That]): That = self.flatMap(f)(bf)
- override def partialMap[B, That](pf: A =>? B)(implicit bf: CanBuildFrom[This, B, That]): That = self.partialMap(pf)(bf)
+ override def partialMap[B, That](pf: PartialFunction[A, B])(implicit bf: CanBuildFrom[This, B, That]): That = self.partialMap(pf)(bf)
override def filter(p: A => Boolean): This = self.filter(p)
override def filterNot(p: A => Boolean): This = self.filterNot(p)
override def partition(p: A => Boolean): (This, This) = self.partition(p)
diff --git a/src/library/scala/collection/TraversableViewLike.scala b/src/library/scala/collection/TraversableViewLike.scala
index 7f4d0ebd71..84c33296db 100644
--- a/src/library/scala/collection/TraversableViewLike.scala
+++ b/src/library/scala/collection/TraversableViewLike.scala
@@ -33,9 +33,7 @@ import TraversableView.NoBuilder
trait TraversableViewLike[+A,
+Coll,
+This <: TraversableView[A, Coll] with TraversableViewLike[A, Coll, This]]
- extends Traversable[A]
- with TraversableLike[A, This]
- with views.TraversableTransformations[A, Coll, This] {
+ extends Traversable[A] with TraversableLike[A, This] {
self =>
override protected[this] def newBuilder: Builder[A, This] =
@@ -43,16 +41,16 @@ self =>
protected def underlying: Coll
- trait Transformed[+B] extends views.TraversableLike[B, Coll] {
- lazy val underlying = self.underlying
- }
-
def force[B >: A, That](implicit bf: CanBuildFrom[Coll, B, That]) = {
val b = bf(underlying)
b ++= this
b.result()
}
+ trait Transformed[+B] extends TraversableView[B, Coll] {
+ lazy val underlying = self.underlying
+ }
+
/** pre: from >= 0
*/
trait Sliced extends Transformed[A] {
@@ -133,6 +131,17 @@ self =>
override def stringPrefix = self.stringPrefix+"D"
}
+ /** Boilerplate methods, to override in each subclass.
+ * These methods could be eliminated if Scala had virtual classes.
+ */
+ protected def newAppended[B >: A](that: Traversable[B]): Transformed[B] = new Appended[B] { val rest = that }
+ protected def newMapped[B](f: A => B): Transformed[B] = new Mapped[B] { val mapping = f }
+ protected def newFlatMapped[B](f: A => Traversable[B]): Transformed[B] = new FlatMapped[B] { val mapping = f }
+ protected def newFiltered(p: A => Boolean): Transformed[A] = new Filtered { val pred = p }
+ protected def newSliced(_from: Int, _until: Int): Transformed[A] = new Sliced { val from = _from; val until = _until }
+ protected def newDroppedWhile(p: A => Boolean): Transformed[A] = new DroppedWhile { val pred = p }
+ protected def newTakenWhile(p: A => Boolean): Transformed[A] = new TakenWhile { val pred = p }
+
override def ++[B >: A, That](that: Traversable[B])(implicit bf: CanBuildFrom[This, B, That]): That = {
newAppended(that).asInstanceOf[That]
// was: if (bf.isInstanceOf[ByPassCanBuildFrom]) newAppended(that).asInstanceOf[That]
diff --git a/src/library/scala/collection/immutable/List.scala b/src/library/scala/collection/immutable/List.scala
index 2c36161003..2088f3ac78 100644
--- a/src/library/scala/collection/immutable/List.scala
+++ b/src/library/scala/collection/immutable/List.scala
@@ -98,7 +98,7 @@ sealed abstract class List[+A] extends LinearSeq[A]
/** Builds a new list by applying a function to all elements of this list.
* Like `xs map f`, but returns `xs` unchanged if function
- * `f` maps all elements to themselves (wrt ==).
+ * `f` maps all elements to themselves (wrt eq).
*
* Note: Unlike `map`, `mapConserve` is not tail-recursive.
*
@@ -106,15 +106,15 @@ sealed abstract class List[+A] extends LinearSeq[A]
* @tparam B the element type of the returned collection.
* @return a list resulting from applying the given function
* `f` to each element of this list and collecting the results.
- * @usecase def mapConserve[B](f: A => B): List[A]
+ * @usecase def mapConserve(f: A => A): List[A]
*/
- def mapConserve[B >: A] (f: A => B): List[B] = {
+ def mapConserve[B >: A <: AnyRef] (f: A => B): List[B] = {
def loop(ys: List[A]): List[B] =
if (ys.isEmpty) this
else {
val head0 = ys.head
val head1 = f(head0)
- if (head1 == head0) {
+ if (head1 eq head0.asInstanceOf[AnyRef]) {
loop(ys.tail)
} else {
val ys1 = head1 :: ys.tail.mapConserve(f)
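A sketch of the sharing behaviour the eq test buys (values illustrative):

    val xs = List("a", "b", "c")
    assert((xs mapConserve (x => x)) eq xs)           // results are eq to the inputs, so xs is reused
    val ys = xs mapConserve (x => new String(x))
    assert(ys == xs && !(ys eq xs))                   // == but not eq, so a new list is built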
diff --git a/src/library/scala/collection/immutable/RedBlack.scala b/src/library/scala/collection/immutable/RedBlack.scala
index 9fd082a7fd..dfb34552cd 100644
--- a/src/library/scala/collection/immutable/RedBlack.scala
+++ b/src/library/scala/collection/immutable/RedBlack.scala
@@ -33,7 +33,7 @@ abstract class RedBlack[A] {
def isBlack: Boolean
def lookup(x: A): Tree[B]
def update[B1 >: B](k: A, v: B1): Tree[B1] = blacken(upd(k, v))
- def delete(k: A): Tree[B] = del(k)
+ def delete(k: A): Tree[B] = blacken(del(k))
def foreach[U](f: (A, B) => U)
@deprecated("use `foreach' instead")
def visit[T](input: T)(f: (T, A, B) => (Boolean, T)): (Boolean, T)
@@ -80,16 +80,77 @@ abstract class RedBlack[A] {
else if (isSmaller(key, k)) balanceRight(isBlack, key, value, left, right.upd(k, v))
else mkTree(isBlack, k, v, left, right)
}
+ // Based on Stefan Kahrs' Haskell version of Okasaki's Red&Black Trees
+ // http://www.cse.unsw.edu.au/~dons/data/RedBlackTree.html
def del(k: A): Tree[B] = {
- if (isSmaller(k, key)) mkTree(isBlack, key, value, left.del(k), right)
- else if (isSmaller(key, k)) mkTree(isBlack, key, value, left, right.del(k))
- else if (left.isEmpty) right
- else if (right.isEmpty) left
- else {
- val s = right.smallest
- mkTree(isBlack, s.key, s.value, left, right.del(s.key))
+ def balance(x: A, xv: B, tl: Tree[B], tr: Tree[B]) = (tl, tr) match {
+ case (RedTree(y, yv, a, b), RedTree(z, zv, c, d)) =>
+ RedTree(x, xv, BlackTree(y, yv, a, b), BlackTree(z, zv, c, d))
+ case (RedTree(y, yv, RedTree(z, zv, a, b), c), d) =>
+ RedTree(y, yv, BlackTree(z, zv, a, b), BlackTree(x, xv, c, d))
+ case (RedTree(y, yv, a, RedTree(z, zv, b, c)), d) =>
+ RedTree(z, zv, BlackTree(y, yv, a, b), BlackTree(x, xv, c, d))
+ case (a, RedTree(y, yv, b, RedTree(z, zv, c, d))) =>
+ RedTree(y, yv, BlackTree(x, xv, a, b), BlackTree(z, zv, c, d))
+ case (a, RedTree(y, yv, RedTree(z, zv, b, c), d)) =>
+ RedTree(z, zv, BlackTree(x, xv, a, b), BlackTree(y, yv, c, d))
+ case (a, b) =>
+ BlackTree(x, xv, a, b)
+ }
+ def subl(t: Tree[B]) = t match {
+ case BlackTree(x, xv, a, b) => RedTree(x, xv, a, b)
+ case _ => error("Defect: invariance violation; expected black, got "+t)
+ }
+ def balLeft(x: A, xv: B, tl: Tree[B], tr: Tree[B]) = (tl, tr) match {
+ case (RedTree(y, yv, a, b), c) =>
+ RedTree(x, xv, BlackTree(y, yv, a, b), c)
+ case (bl, BlackTree(y, yv, a, b)) =>
+ balance(x, xv, bl, RedTree(y, yv, a, b))
+ case (bl, RedTree(y, yv, BlackTree(z, zv, a, b), c)) =>
+ RedTree(z, zv, BlackTree(x, xv, bl, a), balance(y, yv, b, subl(c)))
+ case _ => error("Defect: invariance violation at "+right)
+ }
+ def balRight(x: A, xv: B, tl: Tree[B], tr: Tree[B]) = (tl, tr) match {
+ case (a, RedTree(y, yv, b, c)) =>
+ RedTree(x, xv, a, BlackTree(y, yv, b, c))
+ case (BlackTree(y, yv, a, b), bl) =>
+ balance(x, xv, RedTree(y, yv, a, b), bl)
+ case (RedTree(y, yv, a, BlackTree(z, zv, b, c)), bl) =>
+ RedTree(z, zv, balance(y, yv, subl(a), b), BlackTree(x, xv, c, bl))
+ case _ => error("Defect: invariance violation at "+left)
+ }
+ def delLeft = left match {
+ case _: BlackTree[_] => balLeft(key, value, left.del(k), right)
+ case _ => RedTree(key, value, left.del(k), right)
+ }
+ def delRight = right match {
+ case _: BlackTree[_] => balRight(key, value, left, right.del(k))
+ case _ => RedTree(key, value, left, right.del(k))
+ }
+ def append(tl: Tree[B], tr: Tree[B]): Tree[B] = (tl, tr) match {
+ case (Empty, t) => t
+ case (t, Empty) => t
+ case (RedTree(x, xv, a, b), RedTree(y, yv, c, d)) =>
+ append(b, c) match {
+ case RedTree(z, zv, bb, cc) => RedTree(z, zv, RedTree(x, xv, a, bb), RedTree(y, yv, cc, d))
+ case bc => RedTree(x, xv, a, RedTree(y, yv, bc, d))
+ }
+ case (BlackTree(x, xv, a, b), BlackTree(y, yv, c, d)) =>
+ append(b, c) match {
+ case RedTree(z, zv, bb, cc) => RedTree(z, zv, BlackTree(x, xv, a, bb), BlackTree(y, yv, cc, d))
+ case bc => balLeft(x, xv, a, BlackTree(y, yv, bc, d))
+ }
+ case (a, RedTree(x, xv, b, c)) => RedTree(x, xv, append(a, b), c)
+ case (RedTree(x, xv, a, b), c) => RedTree(x, xv, a, append(b, c))
+ }
+ // RedBlack is neither A : Ordering[A], nor A <% Ordered[A]
+ k match {
+ case _ if isSmaller(k, key) => delLeft
+ case _ if isSmaller(key, k) => delRight
+ case _ => append(left, right)
}
}
+
def smallest: NonEmpty[B] = if (left.isEmpty) this else left.smallest
def toStream: Stream[(A,B)] =
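The rewritten del is what ultimately backs removal on immutable.TreeMap and TreeSet; a quick exercise along the lines of test/files/run/t2849.scala added below (the numbers are illustrative):

    import scala.collection.immutable.TreeSet
    var s = TreeSet(1 to 1000: _*)
    for (i <- 1 to 500) s = s - i                     // every removal goes through blacken(del(k))
    assert(s.size == 500 && s.head == 501)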
diff --git a/src/library/scala/collection/immutable/StringOps.scala b/src/library/scala/collection/immutable/StringOps.scala
index 9138c2bbac..95509ab9d6 100644
--- a/src/library/scala/collection/immutable/StringOps.scala
+++ b/src/library/scala/collection/immutable/StringOps.scala
@@ -17,7 +17,7 @@ import mutable.StringBuilder
/**
* @since 2.8
*/
-class StringOps(override val repr: String) extends StringLike[String] {
+final class StringOps(override val repr: String) extends StringLike[String] {
override protected[this] def thisCollection: WrappedString = new WrappedString(repr)
override protected[this] def toCollection(repr: String): WrappedString = new WrappedString(repr)
@@ -25,5 +25,8 @@ class StringOps(override val repr: String) extends StringLike[String] {
/** Creates a string builder buffer as builder for this class */
override protected[this] def newBuilder = new StringBuilder
+ override def slice(from: Int, until: Int): String =
+ repr.substring(from max 0, until min repr.length)
+
override def toString = repr
}
diff --git a/src/library/scala/collection/immutable/WrappedString.scala b/src/library/scala/collection/immutable/WrappedString.scala
index e535cddfce..e10b3ab0ee 100644
--- a/src/library/scala/collection/immutable/WrappedString.scala
+++ b/src/library/scala/collection/immutable/WrappedString.scala
@@ -26,6 +26,9 @@ class WrappedString(override val self: String) extends IndexedSeq[Char] with Str
/** Creates a string builder buffer as builder for this class */
override protected[this] def newBuilder = WrappedString.newBuilder
+
+ override def slice(from: Int, until: Int): WrappedString =
+ new WrappedString(self.substring(from max 0, until min self.length))
}
/**
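Both slice overrides clamp out-of-range bounds rather than falling back to the generic implementation; roughly (values illustrative):

    import scala.collection.immutable.{ StringOps, WrappedString }
    assert(new StringOps("hello").slice(-2, 99) == "hello")       // bounds clamped to [0, length]
    assert(new WrappedString("hello").slice(3, 42).self == "lo")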
diff --git a/src/library/scala/collection/interfaces/TraversableMethods.scala b/src/library/scala/collection/interfaces/TraversableMethods.scala
index 4cf133d36a..08ade7586d 100644
--- a/src/library/scala/collection/interfaces/TraversableMethods.scala
+++ b/src/library/scala/collection/interfaces/TraversableMethods.scala
@@ -24,7 +24,7 @@ trait TraversableMethods[+A, +This <: TraversableLike[A, This] with Traversable[
// maps/iteration
def flatMap[B, That](f: A => Traversable[B])(implicit bf: CanBuildFrom[This, B, That]): That
def map[B, That](f: A => B)(implicit bf: CanBuildFrom[This, B, That]): That
- def partialMap[B, That](pf: A =>? B)(implicit bf: CanBuildFrom[This, B, That]): That
+ def partialMap[B, That](pf: PartialFunction[A, B])(implicit bf: CanBuildFrom[This, B, That]): That
// new collections
def ++[B >: A, That](that: Iterator[B])(implicit bf: CanBuildFrom[This, B, That]): That
diff --git a/src/library/scala/collection/mutable/ConcurrentMap.scala b/src/library/scala/collection/mutable/ConcurrentMap.scala
new file mode 100644
index 0000000000..d09bf57e1b
--- /dev/null
+++ b/src/library/scala/collection/mutable/ConcurrentMap.scala
@@ -0,0 +1,79 @@
+package scala.collection.mutable
+
+
+
+
+
+
+/**
+ * A template trait for mutable maps that allow concurrent access.
+ * $concurrentmapinfo
+ *
+ * @tparam A the key type of the map
+ * @tparam B the value type of the map
+ *
+ * @define concurrentmapinfo
+ * This is a base trait for all Scala concurrent map implementations. It
+ * provides all of the methods a Map does, with the difference that all the
+ * changes are atomic. It also describes methods specific to concurrent maps.
+ * Note: The concurrent maps do not accept `null` for keys or values.
+ *
+ * @define atomicop
+ * This is done atomically.
+ *
+ * @since 2.8
+ */
+trait ConcurrentMap[A, B] extends Map[A, B] {
+
+ /**
+ * Associates the given key with a given value, unless the key is already associated with some value.
+ * $atomicop
+ *
+ * @param k key with which the specified value is to be associated
+ * @param v value to be associated with the specified key
+ * @return `Some(oldvalue)` if there was a value `oldvalue` previously associated with the
+ * specified key, or `None` if there was no mapping for the specified key
+ */
+ def putIfAbsent(k: A, v: B): Option[B]
+
+ /**
+ * Removes the entry for the specified key if it is currently mapped to the specified value.
+ * $atomicop
+ *
+ * @param k key for which the entry should be removed
+ * @param v value expected to be associated with the specified key if the removal is to take place
+ * @return `true` if the removal took place, `false` otherwise
+ */
+ def remove(k: A, v: B): Boolean
+
+ /**
+ * Replaces the entry for the given key only if it was previously mapped to a given value.
+ * $atomicop
+ *
+ * @param k key for which the entry should be replaced
+ * @param oldvalue value expected to be associated with the specified key if replacing is to happen
+ * @param newvalue value to be associated with the specified key
+ * @return `true` if the entry was replaced, `false` otherwise
+ */
+ def replace(k: A, oldvalue: B, newvalue: B): Boolean
+
+ /**
+ * Replaces the entry for the given key only if it was previously mapped to some value.
+ * $atomicop
+ *
+ * @param k key for which the entry should be replaced
+ * @param v value to be associated with the specified key
+ * @return `Some(v)` if the given key was previously mapped to some value `v`, or `None` otherwise
+ */
+ def replace(k: A, v: B): Option[B]
+
+}
+
+
+
+
+
+
+
+
+
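Assuming the JConcurrentMapWrapper from JavaConversions above as the implementation, the contract reads roughly as follows (keys and values illustrative):

    import scala.collection.JavaConversions.asConcurrentMap
    val m: collection.mutable.ConcurrentMap[String, String] =
      asConcurrentMap(new java.util.concurrent.ConcurrentHashMap[String, String])
    assert(m.putIfAbsent("k", "a") == None)           // inserted
    assert(m.putIfAbsent("k", "b") == Some("a"))      // key present, value left untouched
    assert(m.replace("k", "a", "c"))                  // conditional replace succeeds
    assert(m.remove("k", "c"))                        // conditional remove succeeds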
diff --git a/src/library/scala/collection/mutable/FlatHashTable.scala b/src/library/scala/collection/mutable/FlatHashTable.scala
index 422559f089..0e73bf7fad 100644
--- a/src/library/scala/collection/mutable/FlatHashTable.scala
+++ b/src/library/scala/collection/mutable/FlatHashTable.scala
@@ -184,7 +184,7 @@ trait FlatHashTable[A] {
private def checkConsistent() {
for (i <- 0 until table.length)
if (table(i) != null && !containsEntry(table(i).asInstanceOf[A]))
- assert(false, i+" "+table(i)+" "+table.toString)
+ assert(false, i+" "+table(i)+" "+table.mkString)
}
protected def elemHashCode(elem: A) = if (elem == null) 0 else elem.hashCode()
diff --git a/src/library/scala/collection/mutable/IndexedSeqView.scala b/src/library/scala/collection/mutable/IndexedSeqView.scala
index db1735b543..e864845455 100644
--- a/src/library/scala/collection/mutable/IndexedSeqView.scala
+++ b/src/library/scala/collection/mutable/IndexedSeqView.scala
@@ -13,6 +13,7 @@ package scala.collection
package mutable
import generic._
+
import TraversableView.NoBuilder
/** A non-strict view of a mutable IndexedSeq.
@@ -29,7 +30,9 @@ self =>
def update(idx: Int, elem: A)
- trait Transformed[B] extends views.MutableIndexedSeq[B, Coll] with IndexedSeqView[B, Coll] with super.Transformed[B]
+ trait Transformed[B] extends IndexedSeqView[B, Coll] with super.Transformed[B] {
+ def update(idx: Int, elem: B)
+ }
trait Sliced extends Transformed[A] with super.Sliced {
override def update(idx: Int, elem: A) =
diff --git a/src/library/scala/collection/mutable/PriorityQueue.scala b/src/library/scala/collection/mutable/PriorityQueue.scala
index 203f1eee15..c4dac9effb 100644
--- a/src/library/scala/collection/mutable/PriorityQueue.scala
+++ b/src/library/scala/collection/mutable/PriorityQueue.scala
@@ -35,7 +35,7 @@ class PriorityQueue[A](implicit ord: Ordering[A])
{
import ord._
- private class ResizableArrayAccess[A] extends ResizableArray[A] {
+ private final class ResizableArrayAccess[A] extends ResizableArray[A] {
@inline def p_size0 = size0
@inline def p_size0_=(s: Int) = size0 = s
@inline def p_array = array
diff --git a/src/library/scala/collection/views/Transformed.scala b/src/library/scala/collection/views/Transformed.scala
deleted file mode 100644
index 189ca127c8..0000000000
--- a/src/library/scala/collection/views/Transformed.scala
+++ /dev/null
@@ -1,128 +0,0 @@
-/* __ *\
-** ________ ___ / / ___ Scala API **
-** / __/ __// _ | / / / _ | (c) 2003-2010, LAMP/EPFL **
-** __\ \/ /__/ __ |/ /__/ __ | http://scala-lang.org/ **
-** /____/\___/_/ |_/____/_/ | | **
-** |/ **
-\* */
-
-// $Id$
-
-
-package scala.collection
-package views
-
-import generic.CanBuildFrom
-
-/** These classes act as accumulators for the majority of methods in the
- * collections hierarchy. By creating abstract classes rather than using
- * the traits exclusively, we avoid creating forwarders in dozens of distinct
- * anonymous classes and reduce the size of scala-library.jar by over 200K.
- */
-private[collection] trait Transformed
-private[collection] abstract class TraversableLike[+B, +Coll] extends TraversableView[B, Coll] with Transformed {
- override def foreach[C](f: B => C): Unit
-}
-private[collection] abstract class IterableLike[+B, +Coll] extends TraversableLike[B, Coll] with IterableView[B, Coll] {
- override def iterator: Iterator[B]
-}
-private[collection] abstract class SeqLike[+B, +Coll] extends IterableLike[B, Coll] with SeqView[B, Coll] {
- override def length: Int
- override def apply(idx: Int): B
-}
-private[collection] abstract class IndexedSeqLike[+B, +Coll] extends SeqLike[B, Coll] with IndexedSeqView[B, Coll] {
- /** Override to use IndexedSeq's foreach; todo: see whether this is really faster */
- override def foreach[U](f: B => U) = super[IndexedSeqView].foreach(f)
-}
-private[collection] abstract class MutableIndexedSeq[B, +Coll] extends IndexedSeqLike[B, Coll] {
- def update(idx: Int, elem: B)
-}
-
-/** The boilerplate in the following traits factored out of the *ViewLike classes
- * to reduce noise. It exists only to specialize the return type of each method.
- * It would be unnecessary if scala had virtual classes because the inner classes
- * of subtraits would subclass the parent trait inner classes, and the same method
- * would then suffice for both.
- */
-private[collection] trait TraversableTransformations[+A, +Coll, +This <: TraversableView[A, Coll] with TraversableViewLike[A, Coll, This]] {
- self: TraversableViewLike[A, Coll, This] =>
-
- /** Boilerplate methods, to override in each subclass. */
- protected def newAppended[B >: A](that: Traversable[B]): Transformed[B] = new Appended[B] { val rest = that }
- protected def newMapped[B](f: A => B): Transformed[B] = new Mapped[B] { val mapping = f }
- protected def newFlatMapped[B](f: A => Traversable[B]): Transformed[B] = new FlatMapped[B] { val mapping = f }
- protected def newFiltered(p: A => Boolean): Transformed[A] = new Filtered { val pred = p }
- protected def newSliced(_from: Int, _until: Int): Transformed[A] = new Sliced { val from = _from; val until = _until }
- protected def newDroppedWhile(p: A => Boolean): Transformed[A] = new DroppedWhile { val pred = p }
- protected def newTakenWhile(p: A => Boolean): Transformed[A] = new TakenWhile { val pred = p }
-}
-
-private[collection] trait IterableTransformations[+A, +Coll, +This <: IterableView[A, Coll] with IterableViewLike[A, Coll, This]]
- extends TraversableTransformations[A, Coll, This]
-{
- self: IterableViewLike[A, Coll, This] =>
-
- /** Inherited from TraversableView */
- protected override def newAppended[B >: A](that: Traversable[B]): Transformed[B] = new Appended[B] { val rest = that }
- protected override def newMapped[B](f: A => B): Transformed[B] = new Mapped[B] { val mapping = f }
- protected override def newFlatMapped[B](f: A => Traversable[B]): Transformed[B] = new FlatMapped[B] { val mapping = f }
- protected override def newFiltered(p: A => Boolean): Transformed[A] = new Filtered { val pred = p }
- protected override def newSliced(_from: Int, _until: Int): Transformed[A] = new Sliced { val from = _from; val until = _until }
- protected override def newDroppedWhile(p: A => Boolean): Transformed[A] = new DroppedWhile { val pred = p }
- protected override def newTakenWhile(p: A => Boolean): Transformed[A] = new TakenWhile { val pred = p }
-
- /** IterableView boilerplate contribution */
- protected def newZipped[B](that: Iterable[B]): Transformed[(A, B)] = new Zipped[B] {
- val other = that
- }
- protected def newZippedAll[A1 >: A, B](that: Iterable[B], _thisElem: A1, _thatElem: B): Transformed[(A1, B)] = new ZippedAll[A1, B] {
- val other: Iterable[B] = that
- val thisElem = _thisElem
- val thatElem = _thatElem
- }
-}
-
-private[collection] trait SeqTransformations[+A, +Coll, +This <: SeqView[A, Coll] with SeqViewLike[A, Coll, This]]
- extends IterableTransformations[A, Coll, This]
-{
- self: SeqViewLike[A, Coll, This] =>
-
- /** Inherited from IterableView */
- protected override def newAppended[B >: A](that: Traversable[B]): Transformed[B] = new Appended[B] { val rest = that }
- protected override def newMapped[B](f: A => B): Transformed[B] = new Mapped[B] { val mapping = f }
- protected override def newFlatMapped[B](f: A => Traversable[B]): Transformed[B] = new FlatMapped[B] { val mapping = f }
- protected override def newFiltered(p: A => Boolean): Transformed[A] = new Filtered { val pred = p }
- protected override def newSliced(_from: Int, _until: Int): Transformed[A] = new Sliced { val from = _from; val until = _until }
- protected override def newDroppedWhile(p: A => Boolean): Transformed[A] = new DroppedWhile { val pred = p }
- protected override def newTakenWhile(p: A => Boolean): Transformed[A] = new TakenWhile { val pred = p }
- protected override def newZipped[B](that: Iterable[B]): Transformed[(A, B)] = new Zipped[B] { val other = that }
- protected override def newZippedAll[A1 >: A, B](that: Iterable[B], _thisElem: A1, _thatElem: B): Transformed[(A1, B)] =
- new ZippedAll[A1, B] { val other = that; val thisElem = _thisElem; val thatElem = _thatElem }
-
- /** SeqView boilerplate contribution */
- protected def newReversed: Transformed[A] = new Reversed { }
- protected def newPatched[B >: A](_from: Int, _patch: Seq[B], _replaced: Int): Transformed[B] =
- new Patched[B] { val from = _from; val patch = _patch; val replaced = _replaced }
-}
-
-private[collection] trait IndexedSeqTransformations[+A, +Coll, +This <: IndexedSeqView[A, Coll] with IndexedSeqViewLike[A, Coll, This]]
- extends SeqTransformations[A, Coll, This]
-{
- self: IndexedSeqViewLike[A, Coll, This] =>
-
- /** Inherited from SeqView */
- protected override def newAppended[B >: A](that: Traversable[B]): Transformed[B] = new Appended[B] { val rest = that }
- protected override def newMapped[B](f: A => B): Transformed[B] = new Mapped[B] { val mapping = f }
- protected override def newFlatMapped[B](f: A => Traversable[B]): Transformed[B] = new FlatMapped[B] { val mapping = f }
- protected override def newFiltered(p: A => Boolean): Transformed[A] = new Filtered { val pred = p }
- protected override def newSliced(_from: Int, _until: Int): Transformed[A] = new Sliced { val from = _from; val until = _until }
- protected override def newDroppedWhile(p: A => Boolean): Transformed[A] = new DroppedWhile { val pred = p }
- protected override def newTakenWhile(p: A => Boolean): Transformed[A] = new TakenWhile { val pred = p }
-
- protected override def newZipped[B](that: Iterable[B]): Transformed[(A, B)] = new Zipped[B] { val other = that }
- protected override def newZippedAll[A1 >: A, B](that: Iterable[B], _thisElem: A1, _thatElem: B): Transformed[(A1, B)] =
- new ZippedAll[A1, B] { val other = that; val thisElem = _thisElem; val thatElem = _thatElem }
- protected override def newReversed: Transformed[A] = new Reversed { }
- protected override def newPatched[B >: A](_from: Int, _patch: Seq[B], _replaced: Int): Transformed[B] =
- new Patched[B] { val from = _from; val patch = _patch; val replaced = _replaced }
-}
diff --git a/src/library/scala/concurrent/MailBox.scala b/src/library/scala/concurrent/MailBox.scala
index 3b00d6165d..c23bbf1c80 100644
--- a/src/library/scala/concurrent/MailBox.scala
+++ b/src/library/scala/concurrent/MailBox.scala
@@ -26,7 +26,7 @@ class MailBox extends AnyRef with ListQueueCreator {
def isDefinedAt(msg: Message): Boolean
}
- private class Receiver[A](receiver: Message =>? A) extends PreReceiver {
+ private class Receiver[A](receiver: PartialFunction[Message, A]) extends PreReceiver {
def isDefinedAt(msg: Message) = receiver.isDefinedAt(msg)
@@ -85,7 +85,7 @@ class MailBox extends AnyRef with ListQueueCreator {
* Block until there is a message in the mailbox for which the processor
* <code>f</code> is defined.
*/
- def receive[A](f: Message =>? A): A = {
+ def receive[A](f: PartialFunction[Message, A]): A = {
val r = new Receiver(f)
scanSentMsgs(r)
r.receive()
@@ -95,7 +95,7 @@ class MailBox extends AnyRef with ListQueueCreator {
* Block until there is a message in the mailbox for which the processor
* <code>f</code> is defined or the timeout is over.
*/
- def receiveWithin[A](msec: Long)(f: Message =>? A): A = {
+ def receiveWithin[A](msec: Long)(f: PartialFunction[Message, A]): A = {
val r = new Receiver(f)
scanSentMsgs(r)
r.receiveWithin(msec)
diff --git a/src/library/scala/io/Source.scala b/src/library/scala/io/Source.scala
index e5cf73ff44..e88bfd0bf1 100644
--- a/src/library/scala/io/Source.scala
+++ b/src/library/scala/io/Source.scala
@@ -313,7 +313,7 @@ abstract class Source extends Iterator[Char]
}
/** The close() method closes the underlying resource. */
- def close: Unit =
+ def close(): Unit =
if (closeFunction != null) closeFunction()
/** The reset() method creates a fresh copy of this Source. */
diff --git a/src/library/scala/math/BigDecimal.scala b/src/library/scala/math/BigDecimal.scala
index 2f3c7f131b..6bd6b33484 100644
--- a/src/library/scala/math/BigDecimal.scala
+++ b/src/library/scala/math/BigDecimal.scala
@@ -100,7 +100,7 @@ object BigDecimal
*/
def apply(x: Array[Char]): BigDecimal = apply(x, defaultMathContext)
def apply(x: Array[Char], mc: MathContext): BigDecimal =
- new BigDecimal(new BigDec(x.toString, mc), mc)
+ new BigDecimal(new BigDec(x.mkString, mc), mc)
/** Translates the decimal String representation of a <code>BigDecimal</code>
* into a <code>BigDecimal</code>.
diff --git a/src/library/scala/package.scala b/src/library/scala/package.scala
index bc5b5d36f2..9fa09e3b72 100644
--- a/src/library/scala/package.scala
+++ b/src/library/scala/package.scala
@@ -64,8 +64,6 @@ package object scala {
type Range = scala.collection.immutable.Range
val Range = scala.collection.immutable.Range
- type =>? [-A, +B] = PartialFunction[A, B]
-
// Migrated from Predef
val $scope = scala.xml.TopScope
diff --git a/src/library/scala/runtime/ScalaRunTime.scala b/src/library/scala/runtime/ScalaRunTime.scala
index ebce675347..ecc81c074e 100644
--- a/src/library/scala/runtime/ScalaRunTime.scala
+++ b/src/library/scala/runtime/ScalaRunTime.scala
@@ -100,7 +100,7 @@ object ScalaRunTime {
if (x == null) throw new UninitializedError else x
abstract class Try[+A] {
- def Catch[B >: A](handler: Throwable =>? B): B
+ def Catch[B >: A](handler: PartialFunction[Throwable, B]): B
def Finally(fin: => Unit): A
}
@@ -115,7 +115,7 @@ object ScalaRunTime {
def run() { result = block }
- def Catch[B >: A](handler: Throwable =>? B): B =
+ def Catch[B >: A](handler: PartialFunction[Throwable, B]): B =
if (exception == null) result
else if (handler isDefinedAt exception) handler(exception)
else throw exception
diff --git a/src/library/scala/util/control/Exception.scala b/src/library/scala/util/control/Exception.scala
index 67f9ec183b..356b11df51 100644
--- a/src/library/scala/util/control/Exception.scala
+++ b/src/library/scala/util/control/Exception.scala
@@ -23,14 +23,14 @@ object Exception
// We get lots of crashes using this, so for now we just use Class[_]
// type ExClass = Class[_ <: Throwable]
- type Catcher[+T] = Throwable =>? T
- type ExceptionCatcher[+T] = Exception =>? T
+ type Catcher[+T] = PartialFunction[Throwable, T]
+ type ExceptionCatcher[+T] = PartialFunction[Exception, T]
// due to the magic of contravariance, Throwable => T is a subtype of
// Exception => T, not the other way around. So we manually construct
// a Throwable => T and simply rethrow the non-Exceptions.
implicit def fromExceptionCatcher[T](pf: ExceptionCatcher[T]): Catcher[T] = {
- new (Throwable =>? T) {
+ new PartialFunction[Throwable, T] {
def isDefinedAt(x: Throwable) = x match {
case e: Exception if pf.isDefinedAt(e) => true
case _ => false
@@ -101,7 +101,7 @@ object Exception
/** Create a new Catch with the same isDefinedAt logic as this one,
* but with the supplied apply method replacing the current one. */
def withApply[U](f: (Throwable) => U): Catch[U] = {
- val pf2 = new (Throwable =>? U) {
+ val pf2 = new PartialFunction[Throwable, U] {
def isDefinedAt(x: Throwable) = pf isDefinedAt x
def apply(x: Throwable) = f(x)
}
@@ -139,8 +139,8 @@ object Exception
override def toString() = List("Try(<body>)", catcher.toString) mkString " "
}
- final val nothingCatcher: Throwable =>? Nothing =
- new (Throwable =>? Nothing) {
+ final val nothingCatcher: PartialFunction[Throwable, Nothing] =
+ new PartialFunction[Throwable, Nothing] {
def isDefinedAt(x: Throwable) = false
def apply(x: Throwable) = throw x
}
@@ -207,7 +207,7 @@ object Exception
classes exists (_ isAssignableFrom x.getClass)
private def pfFromExceptions(exceptions: Class[_]*) =
- new (Throwable =>? Nothing) {
+ new PartialFunction[Throwable, Nothing] {
def apply(x: Throwable) = throw x
def isDefinedAt(x: Throwable) = wouldMatch(x, exceptions)
}
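With Catcher now an ordinary PartialFunction, typical use is unchanged; a small sketch (the helper name is illustrative):

    import scala.util.control.Exception.catching
    val asInt: String => Option[Int] =
      s => catching(classOf[NumberFormatException]) opt s.toInt
    assert(asInt("42") == Some(42) && asInt("nope") == None)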
diff --git a/src/library/scala/util/parsing/combinator/Parsers.scala b/src/library/scala/util/parsing/combinator/Parsers.scala
index 1205d2f911..3aa7cc7de1 100644
--- a/src/library/scala/util/parsing/combinator/Parsers.scala
+++ b/src/library/scala/util/parsing/combinator/Parsers.scala
@@ -93,7 +93,7 @@ trait Parsers {
* `f' applied to the result of this `ParseResult', packaged up as a new `ParseResult'.
* If `f' is not defined, `Failure'.
*/
- def mapPartial[U](f: T =>? U, error: T => String): ParseResult[U]
+ def mapPartial[U](f: PartialFunction[T, U], error: T => String): ParseResult[U]
def flatMapWithNext[U](f: T => Input => ParseResult[U]): ParseResult[U]
@@ -119,7 +119,7 @@ trait Parsers {
*/
case class Success[+T](result: T, override val next: Input) extends ParseResult[T] {
def map[U](f: T => U) = Success(f(result), next)
- def mapPartial[U](f: T =>? U, error: T => String): ParseResult[U]
+ def mapPartial[U](f: PartialFunction[T, U], error: T => String): ParseResult[U]
= if(f.isDefinedAt(result)) Success(f(result), next)
else Failure(error(result), next)
@@ -146,7 +146,7 @@ trait Parsers {
lastNoSuccess = this
def map[U](f: Nothing => U) = this
- def mapPartial[U](f: Nothing =>? U, error: Nothing => String): ParseResult[U] = this
+ def mapPartial[U](f: PartialFunction[Nothing, U], error: Nothing => String): ParseResult[U] = this
def flatMapWithNext[U](f: Nothing => Input => ParseResult[U]): ParseResult[U]
= this
@@ -345,7 +345,7 @@ trait Parsers {
* @return a parser that succeeds if the current parser succeeds <i>and</i> `f' is applicable
* to the result. If so, the result will be transformed by `f'.
*/
- def ^? [U](f: T =>? U, error: T => String): Parser[U] = Parser{ in =>
+ def ^? [U](f: PartialFunction[T, U], error: T => String): Parser[U] = Parser{ in =>
this(in).mapPartial(f, error)}.named(toString+"^?")
/** A parser combinator for partial function application
@@ -358,7 +358,7 @@ trait Parsers {
* @return a parser that succeeds if the current parser succeeds <i>and</i> `f' is applicable
* to the result. If so, the result will be transformed by `f'.
*/
- def ^? [U](f: T =>? U): Parser[U] = ^?(f, r => "Constructor function not defined at "+r)
+ def ^? [U](f: PartialFunction[T, U]): Parser[U] = ^?(f, r => "Constructor function not defined at "+r)
/** A parser combinator that parameterises a subsequent parser with the result of this one
@@ -495,7 +495,7 @@ trait Parsers {
* @return A parser that succeeds if `f' is applicable to the first element of the input,
* applying `f' to it to produce the result.
*/
- def accept[U](expected: String, f: Elem =>? U): Parser[U] = acceptMatch(expected, f)
+ def accept[U](expected: String, f: PartialFunction[Elem, U]): Parser[U] = acceptMatch(expected, f)
def acceptIf(p: Elem => Boolean)(err: Elem => String): Parser[Elem] = Parser { in =>
@@ -503,7 +503,7 @@ trait Parsers {
else Failure(err(in.first), in)
}
- def acceptMatch[U](expected: String, f: Elem =>? U): Parser[U] = Parser{ in =>
+ def acceptMatch[U](expected: String, f: PartialFunction[Elem, U]): Parser[U] = Parser{ in =>
if (f.isDefinedAt(in.first)) Success(f(in.first), in.rest)
else Failure(expected+" expected", in)
}
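A minimal sketch of the ^? combinator with an explicit PartialFunction argument (the grammar and names are illustrative):

    import scala.util.parsing.combinator.JavaTokenParsers

    object EvenParser extends JavaTokenParsers {
      def even: Parser[Int] = wholeNumber ^? (
        { case s if s.toInt % 2 == 0 => s.toInt },
        s => s + " is not an even number")
    }
    // EvenParser.parseAll(EvenParser.even, "42") succeeds; "41" fails with the message above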
diff --git a/src/library/scala/xml/Text.scala b/src/library/scala/xml/Text.scala
index 84e4cbe78f..3090883bb8 100644
--- a/src/library/scala/xml/Text.scala
+++ b/src/library/scala/xml/Text.scala
@@ -13,16 +13,22 @@ package scala.xml
import collection.mutable.StringBuilder
-object Text {
- def apply(data: String) =
- if (data != null) new Text(data)
- else throw new IllegalArgumentException("tried to construct Text with null")
-
- def unapply(other: Any) = other match {
- case x: Text => Some(x.data)
- case _ => None
- }
-}
+// XXX This attempt to make Text not a case class revealed a bug in the pattern
+// matcher (see ticket #2883) so I've put the case back. (It was/is desirable that
+// it not be a case class because it is using the antipattern of passing constructor
+// parameters to the superclass where they become vals, but since they will also be
+// vals in the subclass, it acquires an underscore to avoid a name clash.)
+//
+// object Text {
+// def apply(data: String) =
+// if (data != null) new Text(data)
+// else throw new IllegalArgumentException("tried to construct Text with null")
+//
+// def unapply(other: Any): Option[String] = other match {
+// case x: Text => Some(x.data)
+// case _ => None
+// }
+// }
/** The class <code>Text</code> implements an XML node for text (PCDATA).
* It is used in both non-bound and bound XML representations.
@@ -31,9 +37,9 @@ object Text {
*
* @param text the text contained in this node, may not be null.
*/
-class Text(data: String) extends Atom[String](data)
+case class Text(_data: String) extends Atom[String](_data)
{
- if (data == null)
+ if (_data == null)
throw new IllegalArgumentException("tried to construct Text with null")
/** XXX More hashCode flailing. */
diff --git a/src/library/scala/xml/Utility.scala b/src/library/scala/xml/Utility.scala
index 78c0ee9475..1cfe9c79c9 100644
--- a/src/library/scala/xml/Utility.scala
+++ b/src/library/scala/xml/Utility.scala
@@ -289,9 +289,10 @@ object Utility extends AnyRef with parsing.TokenTests
*/
def getName(s: String, index: Int): String = {
if (index >= s.length) null
- else (s drop index) match {
- case Seq(x, xs @ _*) if isNameStart(x) => x.toString + (xs takeWhile isNameChar).mkString
- case _ => ""
+ else {
+ val xs = s drop index
+ if (xs.nonEmpty && isNameStart(xs.head)) xs takeWhile isNameChar
+ else ""
}
}
diff --git a/src/library/scala/xml/parsing/FactoryAdapter.scala b/src/library/scala/xml/parsing/FactoryAdapter.scala
index 2385f645b5..a83f9677a1 100644
--- a/src/library/scala/xml/parsing/FactoryAdapter.scala
+++ b/src/library/scala/xml/parsing/FactoryAdapter.scala
@@ -135,7 +135,9 @@ abstract class FactoryAdapter extends DefaultHandler with factory.XMLLoader[Node
hStack push null
var m: MetaData = Null
- var scpe: NamespaceBinding = scopeStack.top
+ var scpe: NamespaceBinding =
+ if (scopeStack.isEmpty) TopScope
+ else scopeStack.top
for (i <- 0 until attributes.getLength()) {
val qname = attributes getQName i
diff --git a/src/library/scala/xml/parsing/MarkupParser.scala b/src/library/scala/xml/parsing/MarkupParser.scala
index 2779fe1d7c..a15cd0f7e4 100644
--- a/src/library/scala/xml/parsing/MarkupParser.scala
+++ b/src/library/scala/xml/parsing/MarkupParser.scala
@@ -32,6 +32,7 @@ trait MarkupParser extends MarkupParserCommon with TokenTests
self: MarkupParser with MarkupHandler =>
type PositionType = Int
+ type InputType = Source
def xHandleError(that: Char, msg: String) = reportSyntaxError(msg)
@@ -47,6 +48,15 @@ trait MarkupParser extends MarkupParserCommon with TokenTests
//
var curInput: Source = input
+ def lookahead(): BufferedIterator[Char] = new BufferedIterator[Char] {
+ val stream = curInput.toStream
+ curInput = Source.fromIterable(stream)
+ val underlying = Source.fromIterable(stream).buffered
+
+ def hasNext = underlying.hasNext
+ def next = underlying.next
+ def head = underlying.head
+ }
/** the handler of the markup, returns this */
private val handle: MarkupHandler = this
@@ -57,7 +67,6 @@ trait MarkupParser extends MarkupParserCommon with TokenTests
/** holds the position in the source file */
var pos: Int = _
-
/* used when reading external subset */
var extIndex = -1
@@ -379,20 +388,8 @@ trait MarkupParser extends MarkupParserCommon with TokenTests
*/
def xCharData: NodeSeq = {
xToken("[CDATA[")
- val pos1 = pos
- val sb: StringBuilder = new StringBuilder()
- while (true) {
- if (ch==']' &&
- { sb.append(ch); nextch; ch == ']' } &&
- { sb.append(ch); nextch; ch == '>' } ) {
- sb.setLength(sb.length - 2);
- nextch;
- return PCData(sb.toString)
- } else sb.append( ch );
- nextch;
- }
- // bq: (todo) increase grace when meeting CDATA section
- throw FatalError("this cannot happen");
+ def mkResult(pos: Int, s: String): NodeSeq = PCData(s)
+ xTakeUntil(mkResult, () => pos, "]]>")
}
/** CharRef ::= "&amp;#" '0'..'9' {'0'..'9'} ";"
diff --git a/src/library/scala/xml/parsing/MarkupParserCommon.scala b/src/library/scala/xml/parsing/MarkupParserCommon.scala
index c4ba2ccf15..57c46c4685 100644
--- a/src/library/scala/xml/parsing/MarkupParserCommon.scala
+++ b/src/library/scala/xml/parsing/MarkupParserCommon.scala
@@ -18,9 +18,16 @@ import Utility.Escapes.{ pairs => unescape }
* All members should be accessed through those.
*/
private[scala] trait MarkupParserCommon extends TokenTests {
- // type InputType // Source, CharArrayReader
+ private final val SU: Char = 0x1A
+ protected def unreachable = Predef.error("Cannot be reached.")
+
// type HandleType // MarkupHandler, SymbolicXMLBuilder
- // type PositionType // Int, Position
+
+ type InputType // Source, CharArrayReader
+ type PositionType // Int, Position
+
+ /** Create a lookahead reader which does not influence the input */
+ def lookahead(): BufferedIterator[Char]
def ch: Char
def nextch: Char
@@ -48,4 +55,41 @@ private[scala] trait MarkupParserCommon extends TokenTests {
//
def returning[T](x: T)(f: T => Unit): T = { f(x) ; x }
+
+ /** Take characters from input stream until given String "until"
+ * is seen. Once seen, the accumulated characters are passed
+ * along with the current Position to the supplied handler function.
+ */
+ protected def xTakeUntil[T](
+ handler: (PositionType, String) => T,
+ positioner: () => PositionType,
+ until: String): T =
+ {
+ val sb = new StringBuilder
+ val head = until charAt 0
+ val rest = until drop 1
+
+ while (true) {
+ if (ch == head && peek(rest))
+ return handler(positioner(), sb.toString)
+ else if (ch == SU)
+ xHandleError(ch, "") // throws TruncatedXML in compiler
+
+ sb append ch
+ nextch
+ }
+ unreachable
+ }
+
+ /** Create a non-destructive lookahead reader and see if the head
+ * of the input would match the given String. If yes, return true
+ * and drop the entire String from input; if no, return false
+ * and leave input unchanged.
+ */
+ private def peek(lookingFor: String): Boolean =
+ (lookahead() take lookingFor.length sameElements lookingFor.iterator) && {
+ // drop the chars from the real reader (all lookahead + orig)
+ (0 to lookingFor.length) foreach (_ => nextch)
+ true
+ }
}
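The xTakeUntil/peek pair gives MarkupParser a non-destructive way to scan for the "]]>" terminator, which is what the CDATA case below relies on; roughly (mirroring test/files/run/bug2354.scala added below):

    import scala.io.Source
    import scala.xml.parsing.ConstructingParser

    val p = ConstructingParser.fromSource(
      Source.fromString("<t><![CDATA[Hello [tag]]]></t>"), false)
    p.document                                        // CDATA body containing ']' parses cleanly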
diff --git a/src/scalap/scala/tools/scalap/Main.scala b/src/scalap/scala/tools/scalap/Main.scala
index 59c77813dd..59c46df25f 100644
--- a/src/scalap/scala/tools/scalap/Main.scala
+++ b/src/scalap/scala/tools/scalap/Main.scala
@@ -284,8 +284,8 @@ object Main {
* The short name of the package (without prefix)
*/
def name: String = ""
- def classes: List[ClassRep[AbstractFile]] = Nil
- def packages: List[ClassPath[AbstractFile]] = Nil
- def sourcepaths: List[AbstractFile] = Nil
+ val classes: List[ClassRep[AbstractFile]] = Nil
+ val packages: List[ClassPath[AbstractFile]] = Nil
+ val sourcepaths: List[AbstractFile] = Nil
}
}
diff --git a/src/scalap/scala/tools/scalap/scalax/rules/Rule.scala b/src/scalap/scala/tools/scalap/scalax/rules/Rule.scala
index f65c688aa7..1500b81050 100644
--- a/src/scalap/scala/tools/scalap/scalax/rules/Rule.scala
+++ b/src/scalap/scala/tools/scalap/scalax/rules/Rule.scala
@@ -56,9 +56,9 @@ trait Rule[-In, +Out, +A, +X] extends (In => Result[Out, A, X]) {
def ^^[B](fa2b : A => B) = map(fa2b)
- def ^^?[B](pf : A =>? B) = filter (pf.isDefinedAt(_)) ^^ pf
+ def ^^?[B](pf : PartialFunction[A, B]) = filter (pf.isDefinedAt(_)) ^^ pf
- def ??(pf : A =>? Any) = filter (pf.isDefinedAt(_))
+ def ??(pf : PartialFunction[A, Any]) = filter (pf.isDefinedAt(_))
def -^[B](b : B) = map { any => b }
@@ -73,7 +73,7 @@ trait Rule[-In, +Out, +A, +X] extends (In => Result[Out, A, X]) {
def >->[Out2, B, X2 >: X](fa2resultb : A => Result[Out2, B, X2]) = flatMap { a => any => fa2resultb(a) }
- def >>?[Out2, B, X2 >: X](pf : A =>? Rule[Out, Out2, B, X2]) = filter(pf isDefinedAt _) flatMap pf
+ def >>?[Out2, B, X2 >: X](pf : PartialFunction[A, Rule[Out, Out2, B, X2]]) = filter(pf isDefinedAt _) flatMap pf
def >>&[B, X2 >: X](fa2ruleb : A => Out => Result[Any, B, X2]) = flatMap { a => out => fa2ruleb(a)(out) mapOut { any => out } }
diff --git a/src/swing/scala/swing/Reactions.scala b/src/swing/scala/swing/Reactions.scala
index a2327d7b18..14d4deb981 100644
--- a/src/swing/scala/swing/Reactions.scala
+++ b/src/swing/scala/swing/Reactions.scala
@@ -27,7 +27,7 @@ object Reactions {
}
}
- type Reaction = Event =>? Unit
+ type Reaction = PartialFunction[Event, Unit]
/**
* A Reaction implementing this trait is strongly referenced in the reaction list
diff --git a/test/files/neg/t2421b.check b/test/files/neg/t2421b.check
new file mode 100644
index 0000000000..f666a7d9d7
--- /dev/null
+++ b/test/files/neg/t2421b.check
@@ -0,0 +1,4 @@
+t2421b.scala:12: error: could not find implicit value for parameter aa: Test.F[Test.A]
+ f
+ ^
+one error found
\ No newline at end of file
diff --git a/test/files/neg/t2421b.scala b/test/files/neg/t2421b.scala
new file mode 100644
index 0000000000..d8159a8c37
--- /dev/null
+++ b/test/files/neg/t2421b.scala
@@ -0,0 +1,17 @@
+object Test {
+ class A
+ class B
+ class C
+ class F[X]
+
+ def f(implicit aa: F[A]) = println(aa)
+
+ // implicit def a : F[A] = new F[A]()
+ implicit def b[X <: B] = new F[X]()
+
+ f
+}
+
+/* bug:
+error: type arguments [Test2.A] do not conform to method b's type parameter bounds [X <: Test2.B]
+*/
\ No newline at end of file
diff --git a/test/files/neg/t2641.check b/test/files/neg/t2641.check
index 07900d0796..771624e8d9 100644
--- a/test/files/neg/t2641.check
+++ b/test/files/neg/t2641.check
@@ -25,7 +25,7 @@ t2641.scala:27: error: illegal inheritance; superclass Any
trait Sliced extends Transformed[A] with super.Sliced {
^
t2641.scala:27: error: illegal inheritance; superclass Any
- is not a subclass of the superclass TraversableLike
+ is not a subclass of the superclass Object
of the mixin trait Sliced
trait Sliced extends Transformed[A] with super.Sliced {
^
diff --git a/test/files/neg/t2870.check b/test/files/neg/t2870.check
new file mode 100644
index 0000000000..6577577d3f
--- /dev/null
+++ b/test/files/neg/t2870.check
@@ -0,0 +1,7 @@
+t2870.scala:1: error: not found: type Jar
+class Jars(jar: Jar)
+ ^
+t2870.scala:6: error: illegal cyclic reference involving value <import>
+ val scala = fromClasspathString(javaClassPath)
+ ^
+two errors found
diff --git a/test/files/neg/t2870.scala b/test/files/neg/t2870.scala
new file mode 100755
index 0000000000..4de19242e3
--- /dev/null
+++ b/test/files/neg/t2870.scala
@@ -0,0 +1,9 @@
+class Jars(jar: Jar)
+
+object Jars {
+ import scala.util.Properties.javaClassPath
+
+ val scala = fromClasspathString(javaClassPath)
+
+ def fromClasspathString(s: String): Jars = null
+}
diff --git a/test/files/pos/t2421b.scala b/test/files/pos/t2421b.scala
new file mode 100644
index 0000000000..0df3461662
--- /dev/null
+++ b/test/files/pos/t2421b.scala
@@ -0,0 +1,19 @@
+object Test {
+ class A
+ class B
+ class C
+ class F[X]
+
+ def f(implicit aa: F[A]) = println(aa)
+
+ implicit def a : F[A] = new F[A]()
+ implicit def b[X <: B] = new F[X]()
+
+ f
+}
+/* bug:
+error: ambiguous implicit values:
+ both method b in object Test1 of type [X <: Test1.B]Test1.F[X]
+ and method a in object Test1 of type => Test1.F[Test1.A]
+ match expected type Test1.F[Test1.A]
+*/
diff --git a/test/files/pos/t2810.scala b/test/files/pos/t2810.scala
new file mode 100644
index 0000000000..c85eca164a
--- /dev/null
+++ b/test/files/pos/t2810.scala
@@ -0,0 +1,8 @@
+
+
+
+
+object Test {
+ val closeable1: { def close(): Unit } = new scala.io.Source { val iter: Iterator[Char] = "".iterator }
+ val closeable2: { def close(): Unit } = new java.io.Closeable { def close() = {} }
+}
diff --git a/test/files/pos/t2867.scala b/test/files/pos/t2867.scala
new file mode 100644
index 0000000000..0434a380b9
--- /dev/null
+++ b/test/files/pos/t2867.scala
@@ -0,0 +1 @@
+case class A(l: List[_]*)
diff --git a/test/files/positions/Scaladoc6.scala b/test/files/positions/Scaladoc6.scala
new file mode 100644
index 0000000000..8beda625ae
--- /dev/null
+++ b/test/files/positions/Scaladoc6.scala
@@ -0,0 +1,10 @@
+object Scaladoc6 {
+ {
+ /**
+ * Foo
+ */
+ val i = 23
+ }
+
+ def f {}
+}
diff --git a/test/files/positions/Scaladoc7.scala b/test/files/positions/Scaladoc7.scala
new file mode 100644
index 0000000000..6175222e3f
--- /dev/null
+++ b/test/files/positions/Scaladoc7.scala
@@ -0,0 +1,6 @@
+object Scaladoc7 {
+ /**
+ * Foo
+ */
+ val Pair(i, j) = (1, 2)
+}
diff --git a/test/files/positions/Scaladoc8.scala b/test/files/positions/Scaladoc8.scala
new file mode 100644
index 0000000000..519d6ca06c
--- /dev/null
+++ b/test/files/positions/Scaladoc8.scala
@@ -0,0 +1,6 @@
+/**
+ * Foo
+ */
+object Scaladoc8 {
+
+}
diff --git a/test/files/run/bug2354.scala b/test/files/run/bug2354.scala
new file mode 100644
index 0000000000..f46db13a95
--- /dev/null
+++ b/test/files/run/bug2354.scala
@@ -0,0 +1,17 @@
+import scala.xml.parsing._
+import scala.io.Source
+
+object Test
+{
+ val xml_good = "<title><![CDATA[Hello [tag]]]></title>"
+ val xml_bad = "<title><![CDATA[Hello [tag] ]]></title>"
+
+ val parser1 = ConstructingParser.fromSource(Source.fromString(xml_good),false)
+ val parser2 = ConstructingParser.fromSource(Source.fromString(xml_bad),false)
+
+ def main(args: Array[String]): Unit = {
+ parser1.document
+ parser2.document
+ }
+}
+
diff --git a/test/files/run/bug2876.scala b/test/files/run/bug2876.scala
new file mode 100644
index 0000000000..f71879ebff
--- /dev/null
+++ b/test/files/run/bug2876.scala
@@ -0,0 +1,7 @@
+object Test
+{
+ def main(args: Array[String]): Unit = {
+ "x".view.filter(_ => true).take(1)
+ }
+}
+
diff --git a/test/files/run/map_java_conversions.scala b/test/files/run/map_java_conversions.scala
new file mode 100644
index 0000000000..4f9f8a915a
--- /dev/null
+++ b/test/files/run/map_java_conversions.scala
@@ -0,0 +1,60 @@
+
+
+
+
+
+object Test {
+
+ def main(args: Array[String]) {
+ import collection.JavaConversions._
+
+ test(new java.util.HashMap[String, String])
+ test(new java.util.Properties)
+ testConcMap
+ }
+
+ def testConcMap {
+ import collection.JavaConversions._
+
+ val concMap = new java.util.concurrent.ConcurrentHashMap[String, String]
+
+ test(concMap)
+ val cmap = asConcurrentMap(concMap)
+ cmap.putIfAbsent("absentKey", "absentValue")
+ cmap.put("somekey", "somevalue")
+ assert(cmap.remove("somekey", "somevalue") == true)
+ assert(cmap.replace("absentKey", "newAbsentValue") == Some("absentValue"))
+ assert(cmap.replace("absentKey", "newAbsentValue", ".......") == true)
+ }
+
+ def test(m: collection.mutable.Map[String, String]) {
+ m.clear
+ assert(m.size == 0)
+
+ m.put("key", "value")
+ assert(m.size == 1)
+
+ assert(m.put("key", "anotherValue") == Some("value"))
+ assert(m.put("key2", "value2") == None)
+ assert(m.size == 2)
+
+ m += (("key3", "value3"))
+ assert(m.size == 3)
+
+ m -= "key2"
+ assert(m.size == 2)
+ assert(m.nonEmpty)
+ assert(m.remove("key") == Some("anotherValue"))
+
+ m.clear
+ for (i <- 0 until 10) m += (("key" + i, "value" + i))
+ for ((k, v) <- m) assert(k.startsWith("key"))
+ }
+
+}
+
+
+
+
+
+
diff --git a/test/files/run/t2849.scala b/test/files/run/t2849.scala
new file mode 100644
index 0000000000..68094de736
--- /dev/null
+++ b/test/files/run/t2849.scala
@@ -0,0 +1,46 @@
+
+
+
+import scala.collection.immutable.SortedSet
+import scala.collection.immutable.TreeSet
+
+
+
+object Test {
+
+ def main(args: Array[String]) {
+ ticketExample
+ similarExample
+ }
+
+ def ticketExample {
+ var big = 100000
+
+ var aSortedSet: SortedSet[Int] = TreeSet(big)
+
+ for (i <- 1 until 10000) {
+ aSortedSet = (aSortedSet - big) ++ (TreeSet(i, big - 1))
+ big = big - 1
+ if (i % 1000 == 0) {
+ aSortedSet.until(i)
+ }
+ }
+ }
+
+ def similarExample {
+ var big = 100
+
+ var aSortedSet: SortedSet[Int] = TreeSet(big)
+
+ for (i <- 1 until 10000) {
+ aSortedSet = (aSortedSet - big) ++ (TreeSet(i, big - 1)) + big
+ big = big - 1
+ if (i % 1000 == 0) {
+ aSortedSet.until(i)
+ }
+ }
+ }
+
+}
+
+
diff --git a/test/pending/run/bug2364.check b/test/pending/run/bug2364.check
new file mode 100644
index 0000000000..219305e43a
--- /dev/null
+++ b/test/pending/run/bug2364.check
@@ -0,0 +1 @@
+<test></test>
diff --git a/test/pending/run/bug2364.scala b/test/pending/run/bug2364.scala
new file mode 100644
index 0000000000..d5805a13b8
--- /dev/null
+++ b/test/pending/run/bug2364.scala
@@ -0,0 +1,60 @@
+import java.io.ByteArrayInputStream
+import java.io.ByteArrayOutputStream
+import com.sun.xml.internal.fastinfoset._
+import com.sun.xml.internal.fastinfoset.sax._
+import scala.xml.parsing.NoBindingFactoryAdapter
+import scala.xml._
+
+// Note - this is in pending because com.sun.xml.etc is not standard,
+// and I don't have time to extract a smaller test.
+
+object Test {
+ def main(args: Array[String]) {
+ val node = <test/>
+ val bytes = new ByteArrayOutputStream
+ val serializer = new SAXDocumentSerializer()
+
+ serializer.setOutputStream(bytes)
+ serializer.startDocument()
+ serialize(node, serializer)
+ serializer.endDocument()
+ println(parse(new ByteArrayInputStream(bytes.toByteArray)))
+ }
+ def serialize(node: Node, serializer: SAXDocumentSerializer) {
+ node match {
+ case _ : ProcInstr | _ : Comment | _ : EntityRef =>
+ case x : Atom[_] =>
+ val chars = x.text.toCharArray
+ serializer.characters(chars, 0, chars.length)
+ case _ : Elem =>
+ serializer.startElement("", node.label.toLowerCase, node.label.toLowerCase, attributes(node.attributes))
+ for (m <- node.child) serialize(m, serializer)
+ serializer.endElement("", node.label.toLowerCase, node.label.toLowerCase)
+ }
+ }
+ def parse(str: ByteArrayInputStream) = {
+ val parser = new SAXDocumentParser
+ val fac = new NoBindingFactoryAdapter
+
+ parser.setContentHandler(fac)
+ try {
+ parser.parse(str)
+ } catch {
+ case x: Exception =>
+ x.printStackTrace
+ }
+ fac.rootElem
+ }
+ def attributes(d: MetaData) = {
+ val attrs = new AttributesHolder
+
+ if (d != null) {
+ for (attr <- d) {
+ val sb = new StringBuilder()
+ Utility.sequenceToXML(attr.value, TopScope, sb, true)
+ attrs.addAttribute(new QualifiedName("", "", attr.key.toLowerCase), sb.toString)
+ }
+ }
+ attrs
+ }
+}