path: root/src/library
author    Antonio Cunei <antonio.cunei@epfl.ch>  2010-04-07 13:59:44 +0000
committer Antonio Cunei <antonio.cunei@epfl.ch>  2010-04-07 13:59:44 +0000
commit    c64117400e17cceb1b6e489167a71261297a7b4c (patch)
tree      b942bbc7350b26f202312f0ba35f942b197d0618 /src/library
parent    88bcc5c05b3abca7c2f09c59cbeceff0ab892fca (diff)
download  scala-c64117400e17cceb1b6e489167a71261297a7b4c.tar.gz
          scala-c64117400e17cceb1b6e489167a71261297a7b4c.tar.bz2
          scala-c64117400e17cceb1b6e489167a71261297a7b4c.zip
Merged revisions 20582,20586,20597-20603,20607-...
Merged revisions 20582,20586,20597-20603,20607-20615,20619,20623-20625,
20629-20631,20634-20635,20639-20640,20644-20646,20649-20651,20654-20664,
20672-20673,20675-20678,20681-20683,20687-20690,20692-20693,20704-20705,
20707,20710-20714,20716,20718,20720,20723-20724,20727-20730,20734-20735,
20738-20740,20744-20745,20748,20750-20753,20756-20757,20761,20763,
20767-20769,20771-20772,20776-20781,20783,20785,20787-20791,20793-20798,
20802-20803,20805-20807,20812,20816,20818,20826,20828-20832,20834-20835,
20840,20842-20844,20849-20852,20854-20858,20862-20864,20866-20867,20869,
20872-20874,20878-20881,20884-20889,20894-20901,20905-20909,20911-20913,
20917-20918,20920-20922,20928-20929,20932-20938,20941-20942,20944-20945,
20949-20970,20972-20974,20976-21001,21003-21024,21027-21029,21031,
21043-21045,21053-21054,21058-21060,21062-21068,21071,21073-21081,
21083-21088,21091-21094,21098-21103,21105-21111,21113,21115,21121,
21123-21131,21135-21142,21148-21151,21156-21160,21162-21165,21167-21168,
21171,21174-21181,21184,21186-21190,21193,21195-21196,21199-21201,
21205-21207,21210,21214-21220,21222-21250,21252-21254,21256-21266,21269,
21271,21273-21276,21278-21292,21294-21297,21299,21303-21305,21307,21309,
21313,21322-21333,21341-21351,21353-21354,21356
via svnmerge from https://lampsvn.epfl.ch/svn-repos/scala/scala/trunk

........
r20582 | extempore | 2010-01-18 22:18:36 +0100 (Mon, 18 Jan 2010) | 3 lines
More work consolidating the XML code needlessly duplicated between the compiler and the library. Having to fix #2354 in two completely different places was, I found, very motivating.
........
r20586 | extempore | 2010-01-19 04:15:07 +0100 (Tue, 19 Jan 2010) | 14 lines
Digging into why the repl is so slow, discovered that fsc is once again never reusing compiler instances (but for a different reason than #1683.) Small changes break equality and the little troopers are so darn quiet about it.
Steady state, hot fsc repl startup times before this patch:
  0m1.747s 0m1.789s 0m1.842s 0m1.690s
After this patch:
  0m1.139s 0m1.148s 0m1.090s 0m1.091s
No review. Could use a test case but I have trouble coaxing partest this far outside the box.
........
r20597 | dubochet | 2010-01-19 11:52:43 +0100 (Tue, 19 Jan 2010) | 1 line
[scaladoc] Search tool will ignore case for lowercase-only queries. Type return when the search tool is active to immediately search and display the first result. Contributed by Johannes Rudolph. Also: removed useless `DocProvider` class. No review, checked by dubochet.
........
r20598 | dragos | 2010-01-19 14:21:03 +0100 (Tue, 19 Jan 2010) | 2 lines
Fixed isClosureClass in inliner and removed it from CopyPropagation (was dead code). See #2893.
........
r20599 | odersky | 2010-01-19 17:44:20 +0100 (Tue, 19 Jan 2010) | 1 line
More performance improvements; eliminated the mk...Type function in Types.
........
r20600 | extempore | 2010-01-19 19:04:23 +0100 (Tue, 19 Jan 2010) | 2 lines
Added test to pending with extensive exploration of behaviors of instanceOf as compared with type matching.
........
r20601 | extempore | 2010-01-19 19:24:29 +0100 (Tue, 19 Jan 2010) | 2 lines
Iterators created with duplicate compare equal if they are positioned at the same element. Review by community.
........
r20602 | extempore | 2010-01-19 20:28:17 +0100 (Tue, 19 Jan 2010) | 4 lines
Until now all scala builds performed not in an svn tree were given no version number, because the version was extracted from "svn info". Now it tries git style if svn info is unrevealing. Review by community.
........
r20603 | extempore | 2010-01-19 20:41:41 +0100 (Tue, 19 Jan 2010) | 1 line
Test case for #2148. Closes #2148, no review.
........
r20607 | extempore | 2010-01-20 01:27:53 +0100 (Wed, 20 Jan 2010) | 5 lines
Made some cosmetic but clarity-increasing changes to a few files. Primarily, used corresponds where possible rather than zipped.forall.
Added isImplicit and isJava to MethodType so the relevant subtypes could be determined without the hideous isInstanceOf checks. Review by odersky.
........
r20608 | extempore | 2010-01-20 01:28:09 +0100 (Wed, 20 Jan 2010) | 1 line
Fix for #2927. No review.
........
r20609 | extempore | 2010-01-20 03:36:37 +0100 (Wed, 20 Jan 2010) | 1 line
Un-overloaded StringLike.format. Closes #2898. No review.
........
r20610 | extempore | 2010-01-20 03:36:51 +0100 (Wed, 20 Jan 2010) | 1 line
Removed some debugging echoes I let slip through.
........
r20611 | extempore | 2010-01-20 05:38:32 +0100 (Wed, 20 Jan 2010) | 2 lines
Took a slightly different tack on parsing the svn version. No review.
........
r20612 | extempore | 2010-01-20 06:50:37 +0100 (Wed, 20 Jan 2010) | 6 lines
No longer are there more IDE-specific junk files in the root directory of the official scala repository than actual scala files and directories put together. It's a truly awful first impression to give potential developers, so I'm interpreting the non-response regarding the need for them as quiet encouragement to put them in src/intellij. Review by ilyas.
........
r20613 | extempore | 2010-01-20 06:50:51 +0100 (Wed, 20 Jan 2010) | 2 lines
Bringing README up to date and filling in some of the info gaps. Review by cunei.
........
r20614 | extempore | 2010-01-20 06:51:12 +0100 (Wed, 20 Jan 2010) | 4 lines
Removed static state from the global ClassPath object, and some minor repositioning while in there. Closes #2928, but the intentions behind -optimise being intertwined with ClassPath could really use some source comments. Review by rytz.
........
r20615 | milessabin | 2010-01-20 10:12:10 +0100 (Wed, 20 Jan 2010) | 1 line
Renamed new bin directory to "tools" to avoid conflict with Eclipse incremental build output directory.
........
r20619 | rytz | 2010-01-20 11:55:56 +0100 (Wed, 20 Jan 2010) | 1 line
Fixed bugs in .NET bytecode generation (branching out of try / catch / finally blocks is not allowed).
predef.dll now almost passes PEVerify. No review.
........
r20623 | dubochet | 2010-01-20 15:41:58 +0100 (Wed, 20 Jan 2010) | 1 line
[scaladoc] Default values of parameters are documented. Tags "@author", "@see", "@since", "@version", and "@deprecated" are displayed in documentation. Contributed by Pedro Furlanetto, checked by dubochet, no review.
........
r20624 | plocinic | 2010-01-20 18:15:49 +0100 (Wed, 20 Jan 2010) | 1 line
Closes #2653, #2652, #2556. The last one required a more sophisticated mechanism for detecting invalid references to inherited members, but at least it doesn't seem to cause unnecessary recompilations.
........
r20625 | odersky | 2010-01-20 19:29:13 +0100 (Wed, 20 Jan 2010) | 2 lines
Attempt to fix #2926 (companion object of case class problem in Eclipse).
........
r20629 | odersky | 2010-01-21 20:27:39 +0100 (Thu, 21 Jan 2010) | 1 line
Fix for #2867 undone, review by extempore.
........
r20630 | extempore | 2010-01-21 21:52:32 +0100 (Thu, 21 Jan 2010) | 1 line
Moved test case for just-reverted patch to pending. No review.
........
r20631 | extempore | 2010-01-21 22:00:08 +0100 (Thu, 21 Jan 2010) | 17 lines
It's a big REPL patch. And it contains:
* Eliminated a bug which was causing all repl lines to be parsed twice
* Removed reference to JLine from InterpreterLoop which was causing someone trouble in eclipse
* Enriched the repl compile/reflect mechanism to allow retrieving the value as well as the String describing it
* Utilized said enrichment to write an eval[T] method which is exposed in the repl in :power mode
* Added ability to turn off string unwrapping in repl: settings.unwrapStrings = false
* Created interface presently called Completion.Special which lets objects define their own contents
* As minor demonstration of above, in :power mode variable "repl" implements Special and completes with all repl identifiers
* As more interesting demonstration of above, try a repl session like...
  import scala.tools.nsc.interpreter.Completion.Special
  import scala.tools.nsc.io.Process
  val connections = new Special {
    def tabCompletions() = Process("netstat -p tcp").toList drop 2 map (_ split "\\s+" apply 4)
  }
  connections.<tab>

Review by community!
........
r20634 | odersky | 2010-01-22 17:50:55 +0100 (Fri, 22 Jan 2010) | 1 line
Second attempt to fix #2926. Reverted first attempt. Review by milessabin.
........
r20635 | rytz | 2010-01-22 17:55:23 +0100 (Fri, 22 Jan 2010) | 1 line
Fix several issues in .NET backend / type parsing. Allow re-building forkjoin.jar separately. No review.
........
r20639 | odersky | 2010-01-23 18:44:20 +0100 (Sat, 23 Jan 2010) | 2 lines
Closes #2926. Review by milessabin.
........
r20640 | extempore | 2010-01-23 21:30:01 +0100 (Sat, 23 Jan 2010) | 9 lines
Another big REPL patch. Lacking the time right now to write a proper commit message, I will just say it adds a couple of pretty frabjous features, in addition to cleaning up a whole bunch of questionable code.
* Tab-completion now chains through intermediate results on fields and 0-arg methods
* You can now define custom Completors which define their own contents.
Details and demos to come in a wiki document about the repl.
........
r20644 | extempore | 2010-01-24 01:31:38 +0100 (Sun, 24 Jan 2010) | 2 lines
Some minor polishing to the previous repl completion patch, plus a few new features and improvements.
........
r20645 | odersky | 2010-01-24 16:05:46 +0100 (Sun, 24 Jan 2010) | 2 lines
Renamed notCompiles ==> canRedefine. Made code completion more robust.
........
r20646 | extempore | 2010-01-24 16:19:55 +0100 (Sun, 24 Jan 2010) | 1 line
Removing a stray paren to unbreak build. No review.
........
r20649 | dragos | 2010-01-24 19:15:23 +0100 (Sun, 24 Jan 2010) | 2 lines
Fixed dead code elimination to satisfy YourKit's instrumentation: a drop for a newly initialized object is always added after the constructor call, instead of immediately after a DUP.
........
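[Editorial aside, not part of the original log: r20607 above mentions replacing `zipped.forall` with `corresponds`. A minimal sketch of that idiom, written against current Scala collections (behavior assumed consistent with the change described):

```scala
// corresponds checks two sequences pairwise against a predicate,
// and is true only when both sequences also have the same length.
val words   = List("a", "ab", "abc")
val lengths = List(1, 2, 3)

// Same spirit as zipping and calling forall, but with no intermediate
// sequence of pairs and with length mismatches detected.
val matches = words.corresponds(lengths)((w, n) => w.length == n)

// A shorter second sequence makes corresponds false, where a naive
// zip-based check would silently truncate to the shorter length.
val truncated = words.corresponds(List(1, 2))((w, n) => w.length == n)
```

The length check is the practical advantage over `zip`-then-`forall`.]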
r20650 | plocinic | 2010-01-24 23:57:30 +0100 (Sun, 24 Jan 2010) | 1 line
Added new target for partest for testing Build Manager behaviour. Updated Refined Build Manager for consistency. --buildmanager requires the test file that describes actions to be done on the refined build manager (compiling files, making changes to the classes), sources of changes, the check file and the initial scala source files. Cleaned up some code in partest, refactored some obvious code duplication. Tests to follow... Review by phaller.
........
r20651 | plocinic | 2010-01-25 00:00:02 +0100 (Mon, 25 Jan 2010) | 1 line
First test for buildmanager. No review necessary.
........
r20654 | plocinic | 2010-01-25 11:28:11 +0100 (Mon, 25 Jan 2010) | 1 line
Fix ant task for build manager partest. No review.
........
r20655 | rytz | 2010-01-25 11:44:16 +0100 (Mon, 25 Jan 2010) | 1 line
Small improvements in building newlibs / newforkjoin. No review.
........
r20656 | dubochet | 2010-01-25 12:31:44 +0100 (Mon, 25 Jan 2010) | 1 line
[scaladoc] Deprecated methods are struck out. There is an `implicit` keyword in front of implicit parameters. Fixed an issue with inherited constructors printed in documentation. Contributed by Pedro Furlanetto, checked by dubochet, no review.
........
r20657 | dragos | 2010-01-25 13:43:40 +0100 (Mon, 25 Jan 2010) | 2 lines
Fixed #2497: replaceInstruction now uses reference equality. No review necessary.
........
r20658 | rytz | 2010-01-25 15:20:52 +0100 (Mon, 25 Jan 2010) | 1 line
Close #2929. Review by community (see typedValDef in Typers.scala; the BYNAMEPARAM flag was missing).
........
r20659 | dubochet | 2010-01-25 15:23:36 +0100 (Mon, 25 Jan 2010) | 1 line
[scaladoc] Classes `FunctionX`, `ProductX`, and `TupleX`, for `X` greater than 2, are not listed. Contributed by Pedro Furlanetto, checked by dubochet, no review.
........
r20660 | milessabin | 2010-01-25 17:14:02 +0100 (Mon, 25 Jan 2010) | 1 line
Use file paths for equality and hashCode to deal with mixed-type file equality test issues in the IDE. Fixes #2931.
........
r20661 | prokopec | 2010-01-25 19:21:04 +0100 (Mon, 25 Jan 2010) | 2 lines
Replacement in matching can now be done by providing function arguments for replacement. Fixes #2761. Review by phaller.
........
r20662 | prokopec | 2010-01-25 19:21:32 +0100 (Mon, 25 Jan 2010) | 1 line
Test file for matching with replace.
........
r20663 | odersky | 2010-01-25 19:32:56 +0100 (Mon, 25 Jan 2010) | 1 line
Fixed stability problem with build.
........
r20664 | prokopec | 2010-01-25 20:22:06 +0100 (Mon, 25 Jan 2010) | 1 line
Fixes #2766. Review by phaller.
........
r20672 | plocinic | 2010-01-26 11:03:04 +0100 (Tue, 26 Jan 2010) | 1 line
Changed the info statements in the refined build manager to print the information in a more consistent way, so that we can test it using partest. Added more tests for build manager, more to follow... No review.
........
r20673 | plocinic | 2010-01-26 11:33:10 +0100 (Tue, 26 Jan 2010) | 1 line
More tests, plus missing file. No review.
........
r20675 | plocinic | 2010-01-26 14:38:38 +0100 (Tue, 26 Jan 2010) | 1 line
Hack for the refined build manager to print info in a deterministic way. No review.
........
r20676 | prokopec | 2010-01-26 15:03:19 +0100 (Tue, 26 Jan 2010) | 1 line
Access modifiers added for certain members and some refactoring in Regex.
........
r20677 | dubochet | 2010-01-26 15:03:24 +0100 (Tue, 26 Jan 2010) | 1 line
Fixed a number of faulty Scaladoc comments in library and compiler sources. No review.
........
r20678 | plocinic | 2010-01-26 15:21:58 +0100 (Tue, 26 Jan 2010) | 1 line
Another batch of tests. No review.
........
r20681 | extempore | 2010-01-26 23:14:15 +0100 (Tue, 26 Jan 2010) | 13 lines
Refinements to the recent repl patches. You can now complete on a few more things, like literals (1.<tab>, "abc".<tab>).
Also, a completion-aware case class walker which leverages the names of the case fields for completion. For instance:

  :power
  val x = new ProductCompletion(mkTree("def f(x: Int, y: Int) = f(5, 10) + f(10, 20)"))
  x.<tab>
  mods name rhs tparams tpt vparamss
  x.rhs.fun.<tab>
  name qualifier
  scala> x.rhs.fun.qualifier
  res3: scala.tools.nsc.ast.Trees$Apply = f(5, 10)
........
r20682 | plocinic | 2010-01-27 00:01:41 +0100 (Wed, 27 Jan 2010) | 1 line
Refactored some of the code from r20624 thanks to Iulian's review.
........
r20683 | plocinic | 2010-01-27 00:08:20 +0100 (Wed, 27 Jan 2010) | 1 line
Cleaning up. No review.
........
r20687 | phaller | 2010-01-27 10:07:56 +0100 (Wed, 27 Jan 2010) | 1 line
Made actor-receivewithin test deterministic.
........
r20688 | rytz | 2010-01-27 11:23:31 +0100 (Wed, 27 Jan 2010) | 1 line
Close #2868. Problem was: when the same constant is used in a ConstantType and a LiteralAnnotArg, it is stored inside the unpickle cache 'entries' (see 'def at' in UnPickler) once as Constant, once as LiteralAnnotArg, resulting in a CCE for the second read. Review by extempore.
........
r20689 | rytz | 2010-01-27 11:28:52 +0100 (Wed, 27 Jan 2010) | 1 line
Fix pickle format doc. No review.
........
r20690 | rytz | 2010-01-27 12:08:54 +0100 (Wed, 27 Jan 2010) | 1 line
Reverting r20688 for now, no review.
........
r20692 | plocinic | 2010-01-27 13:56:24 +0100 (Wed, 27 Jan 2010) | 1 line
Closes #2966. Review by milessabin.
........
r20693 | plocinic | 2010-01-27 14:15:16 +0100 (Wed, 27 Jan 2010) | 1 line
Forgot to commit the change. No review.
........
r20704 | rytz | 2010-01-27 15:12:28 +0100 (Wed, 27 Jan 2010) | 1 line
Now correctly fix #2868. No review.
........
r20705 | extempore | 2010-01-27 17:53:36 +0100 (Wed, 27 Jan 2010) | 1 line
Fix for #2563. Review by mharrah.
........
r20707 | extempore | 2010-01-27 22:27:22 +0100 (Wed, 27 Jan 2010) | 1 line
Some hardening of repl generated code. No review.
........
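[Editorial aside, not part of the original log: the function-argument replacement introduced for Regex in r20661 above survives in today's `scala.util.matching.Regex`; a small sketch against the current API:

```scala
import scala.util.matching.Regex

// replaceAllIn can take a Match => String function instead of a fixed
// replacement string, so each match can be rewritten from its own content.
val digits: Regex = """\d+""".r

// Each run of digits is parsed as an Int and doubled in place.
val doubled = digits.replaceAllIn("a1b22", m => (m.matched.toInt * 2).toString)
// "a1b22" -> "a2b44"
```

Note that the string the function returns is still treated as a replacement template, so literal `$` or `\` would need `Regex.quoteReplacement`.]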
r20710 | extempore | 2010-01-28 06:46:06 +0100 (Thu, 28 Jan 2010) | 6 lines
One of those "$.05 for the bolt, $50,000 for knowing where to put it" commits. Closes #425, #816, #2310, #2691. All credit for this patch goes to me for having the genius to know when new eyes were needed (although if you're feeling generous some could also go to walter korman for the actual debugging and code writing part.)
........
r20711 | extempore | 2010-01-28 06:46:36 +0100 (Thu, 28 Jan 2010) | 5 lines
Moved some test cases from pending to files since the bugs they were watching for seem to be fixed. Moved some other test cases from pending to disabled because they deceptively claim to pass while investigation reveals the ticket needs to remain open. Closes #1996, #2660.
........
r20712 | rytz | 2010-01-28 11:26:00 +0100 (Thu, 28 Jan 2010) | 1 line
close #2886 (applied patch). no review
........
r20713 | dubochet | 2010-01-28 11:48:38 +0100 (Thu, 28 Jan 2010) | 6 lines
[scaladoc] Comment parsing is improved:
* tags in code blocks no longer confuse the parser;
* `@note` and `@example` are recognised tags;
* Empty comments no longer generate "must start with a sentence" warnings;
* `@usecase` parsing works better in some situations with blank comment lines above or below.
No review.
........
r20714 | dubochet | 2010-01-28 11:48:53 +0100 (Thu, 28 Jan 2010) | 1 line
[scaladoc] In HTML documentation, `@return` tag is printed also when there is no `@param` tag present. No review.
........
r20716 | rytz | 2010-01-28 15:14:20 +0100 (Thu, 28 Jan 2010) | 1 line
Removing defaultGetter field from TermSymbols. review by odersky (see 'def defaultGetter' in typechecker/NamesDefaults.scala)
........
r20718 | extempore | 2010-01-28 17:16:22 +0100 (Thu, 28 Jan 2010) | 9 lines
Added :search to power mode for finding classes on the classpath by regular expression, plus a bunch of compiler hacker convenience methods to the repl.
Now after :power you can:

  mkContext()                          // Context
  mkUnit("class Q")                    // CompilationUnit
  mkTypedTree("class A { val x = 5 }") // Tree after phase typer
  mkType("java.util.Map")              // Type object

... and many more. No review.
........
r20720 | extempore | 2010-01-28 23:55:42 +0100 (Thu, 28 Jan 2010) | 2 lines
Added a command line option for desugaring match blocks differently for debugging purposes. No review.
........
r20723 | extempore | 2010-01-29 01:37:07 +0100 (Fri, 29 Jan 2010) | 6 lines
I'm sure I'm not the only one driven into paroxysms of annoyance at the fact that repl transcripts are not at all usable in the repl. No longer: now you can paste a transcript into the repl and it will automatically detect it as such, clean it up and replay it. It is triggered by the "scala> " on the first line of the transcript. Review by community.
........
r20724 | plocinic | 2010-01-29 12:23:52 +0100 (Fri, 29 Jan 2010) | 1 line
Closes #2650. Dependency on type alias requires analysis before uncurry phase. Added tests. Review by dragos.
........
r20727 | odersky | 2010-01-29 15:04:06 +0100 (Fri, 29 Jan 2010) | 1 line
New starr to bag performance improvements and fixes to companion objects.
........
r20728 | extempore | 2010-01-29 16:43:41 +0100 (Fri, 29 Jan 2010) | 2 lines
Added hashCode implementations to Manifest types where necessary. Closes #2838. No review.
........
r20729 | extempore | 2010-01-29 20:11:20 +0100 (Fri, 29 Jan 2010) | 3 lines
Implemented rompf's suggested improvement to the tail recursive combinators, avoiding re-evaluation of the by-name argument. Score one for code review. No review. (Ironic.)
........
r20730 | extempore | 2010-01-29 20:11:38 +0100 (Fri, 29 Jan 2010) | 3 lines
A few compiler IO lib bits I have been needing: some basic conveniences for directories and sockets, and some cleanups in CompileSocket. Review by community.
........
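[Editorial aside, not part of the original log: r20729 above is about evaluating a by-name argument only once inside combinators. The general trick can be sketched like this; the names here are made up for illustration and are not the library's code:

```scala
// Count how many times the by-name argument actually runs.
var evaluations = 0
def expensive: Int = { evaluations += 1; 42 }

// Naive version: each use of x re-evaluates the by-name argument.
def naiveTwice(x: => Int): Int = x + x

// Fixed version: force x once into a val, then reuse the cached value.
def cachedTwice(x: => Int): Int = { val v = x; v + v }

val a = naiveTwice(expensive)           // two evaluations happen here
val countAfterNaive = evaluations
val b = cachedTwice(expensive)          // only one more evaluation
val countAfterCached = evaluations
```

Both return the same value; only the evaluation count differs, which is exactly what matters when the argument has side effects or is costly.]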
r20734 | extempore | 2010-01-30 07:30:57 +0100 (Sat, 30 Jan 2010) | 6 lines
A compact tree printer, for primitives like myself who do all their debugging in the console and need extraneous information filtered out. New option: -Ycompact-trees. Supply that in conjunction with -Xprint:all and suddenly the output is a (relative) masterpiece of concision. Review by anyone who is game to review such a thing. Community?
........
r20735 | plocinic | 2010-01-30 18:06:34 +0100 (Sat, 30 Jan 2010) | 1 line
Better test for checking existential types, where symbols do not necessarily have the same name. Added test for that. The problem manifested itself in Globals.scala for the variable classpath, causing excessive compilation without any reason. No review.
........
r20738 | extempore | 2010-01-31 06:39:25 +0100 (Sun, 31 Jan 2010) | 2 lines
Band-aid for #3004. No review unless you want to take on name mangling and forwarders, in which case review away.
........
r20739 | extempore | 2010-01-31 17:34:24 +0100 (Sun, 31 Jan 2010) | 4 lines
Great moments in typos: somehow the "decodeUni" in CharArrayReader had transmogrified into "decodeUnit" in UnitScanner, thus causing -Xno-uescape to be ignored. Also, removed a now unused -X option. Review by community.
........
r20740 | extempore | 2010-01-31 17:59:19 +0100 (Sun, 31 Jan 2010) | 2 lines
Solidified the logic of stringOf for repl results printing. Closes #726. Review by community.
........
r20744 | extempore | 2010-02-01 02:24:11 +0100 (Mon, 01 Feb 2010) | 2 lines
Unbroke the build. Remember kids, "Node extends NodeSeq extends Seq[Node]" means never having to meet your base case. No review.
........
r20745 | plocinic | 2010-02-01 10:34:28 +0100 (Mon, 01 Feb 2010) | 1 line
Check recursively the type aliases. Closes #2650. Review by dragos.
........
r20748 | odersky | 2010-02-01 16:10:26 +0100 (Mon, 01 Feb 2010) | 1 line
Lifted out core compiler data structures into the reflect.generic package. Made Unpickler work on generic data.
........
r20750 | odersky | 2010-02-01 16:49:33 +0100 (Mon, 01 Feb 2010) | 1 line
Missing bits of r20746. For some reason smartsvn did not show these before. Review by dubochet, extempore.
........
r20751 | extempore | 2010-02-01 17:06:14 +0100 (Mon, 01 Feb 2010) | 3 lines
Removed scala.util.NameTransformer (it moved to reflect.) We don't have to @deprecate it since it's never been in a released version. No review.
........
r20752 | plocinic | 2010-02-01 17:16:11 +0100 (Mon, 01 Feb 2010) | 1 line
Exclude anonymous function classes from the definitions in dependency analysis. This was causing spurious errors in for example Global.scala and Interpreter.scala because of fresh names numbering. Also cleaned up some code. No review.
........
r20753 | odersky | 2010-02-01 18:15:05 +0100 (Mon, 01 Feb 2010) | 1 line
Suppresses generation of manifests for abstract type members.
........
r20756 | extempore | 2010-02-01 23:35:12 +0100 (Mon, 01 Feb 2010) | 2 lines
Quite a lot more work on completion. The main bit is that file completion is now available, with some caveats. Review by community.
........
r20757 | extempore | 2010-02-02 00:19:33 +0100 (Tue, 02 Feb 2010) | 3 lines
Continuing the fine work creating an abstract interface to the compiler in scala.reflect.generic, promoted Trees#Traverser and made the associated changes. Review by odersky.
........
r20761 | plocinic | 2010-02-02 11:36:38 +0100 (Tue, 02 Feb 2010) | 1 line
Cleaned up the code slightly. No review.
........
r20763 | plocinic | 2010-02-02 12:28:56 +0100 (Tue, 02 Feb 2010) | 1 line
Fixed tests. No review.
........
r20767 | phaller | 2010-02-02 13:40:38 +0100 (Tue, 02 Feb 2010) | 1 line
Closes #3009.
........
r20768 | plocinic | 2010-02-02 15:51:03 +0100 (Tue, 02 Feb 2010) | 1 line
Correctly check annotated types. The problem showed up for example in Interpreter.scala. No review.
........
r20769 | extempore | 2010-02-02 17:46:23 +0100 (Tue, 02 Feb 2010) | 5 lines
Hid some AST nodes from the prying eyes of reflectors.
Now Parens, AssignOrNamedArg, and DocDef are known only to scalac. Also some cosmetic arranging in the new reflect.generic package, because there's never a better time than when the code is still warm from the compiler. Review by odersky.
........
r20771 | extempore | 2010-02-02 20:43:07 +0100 (Tue, 02 Feb 2010) | 9 lines
Took a swing at sorting out sorting. The major components are rewriting the Sorting methods to accept Orderings and adding a sorted method to SeqLike, because we should all be pretty tired of writing ".sortWith(_ < _)" by now. I think it should be called "sort", not "sorted", but that refuses to coexist gracefully with the deprecated sort in List. Review by moors (chosen pretty arbitrarily, someone at epfl should review it but I don't know who deserves the nomination.)
........
r20772 | extempore | 2010-02-02 21:13:28 +0100 (Tue, 02 Feb 2010) | 5 lines
It was pointed out that sorted and the 1-arg version of sortWith are the same method, one with an implicit argument, one without. Since sortWith has never existed in a release, we can un-overload it (which is a win anyway) and route everything through sorted. Review by moors.
........
r20776 | plocinic | 2010-02-03 11:22:27 +0100 (Wed, 03 Feb 2010) | 1 line
Fixes the problem mentioned in #2882, which seems to be the reason for #2280 - allow simple analysis on java sources. Review by dragos.
........
r20777 | extempore | 2010-02-03 16:52:25 +0100 (Wed, 03 Feb 2010) | 2 lines
Made sliding/grouped throw an exception when read past the end. Closes #3017.
........
r20778 | dubochet | 2010-02-03 18:03:58 +0100 (Wed, 03 Feb 2010) | 1 line
[scaladoc] Optional link to source (set parameter "-doc-source-url"). Support for commenting packages (using package objects). Contributed by Pedro Furlanetto. Also: small performance improvements, short comment extraction is more robust (but no HTML tags allowed in first sentence), small code clean-ups. Checked by dubochet, no review.
........
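[Editorial aside, not part of the original log: the `sorted` method added to SeqLike in r20771/r20772 above is still the standard spelling today; a quick sketch against current Scala collections:

```scala
// sorted picks up an implicit Ordering, replacing the old
// ".sortWith(_ < _)" habit the commit message complains about.
val nums = List(3, 1, 2)
val ascending  = nums.sorted             // uses Ordering[Int]
val descending = nums.sortWith(_ > _)    // explicit comparison still available
val byLength   = List("ccc", "a", "bb").sortBy(_.length)
```

`sortBy` and `sortWith` remain for custom orders; `sorted` covers the common case with no boilerplate.]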
r20779 | extempore | 2010-02-03 18:34:31 +0100 (Wed, 03 Feb 2010) | 5 lines
Striking while the iron is hot, renamed removeDuplicates to unique and deprecated removeDuplicates. The debate between distinct and unique was vigorous but unique won by a freckle. (Dark horse 'nub' was disqualified for taking performance enhancers.) The only thing which might need review is the choice of name, but review by odersky.
........
r20780 | dpp | 2010-02-03 19:04:50 +0100 (Wed, 03 Feb 2010) | 1 line
Fixed XML Utility.escape method to conform to XML spec. Closes #3014
........
r20781 | dragos | 2010-02-03 19:17:17 +0100 (Wed, 03 Feb 2010) | 2 lines
Preserve source order for class members in generated bytecode. No review necessary.
........
r20783 | extempore | 2010-02-03 22:12:56 +0100 (Wed, 03 Feb 2010) | 2 lines
Created MSILGlobal to start breaking the dependency on msil.jar for those platforms which don't use msil. Review by rytz.
........
r20785 | extempore | 2010-02-04 00:51:49 +0100 (Thu, 04 Feb 2010) | 7 lines
A big push to make the interpreter easier to instantiate without having to dodge bullets. It shouldn't have to be any harder than this:

  scala> new scala.tools.nsc.Interpreter().evalExpr[Int]("5*5")
  res0: Int = 25

...and now it isn't. Review by community.
........
r20787 | extempore | 2010-02-04 13:26:02 +0100 (Thu, 04 Feb 2010) | 3 lines
Noticed that all the system properties were being read into vals so they didn't notice changes. Determined this was not correct, and changed them into defs. No review.
........
r20788 | extempore | 2010-02-04 13:26:42 +0100 (Thu, 04 Feb 2010) | 2 lines
Unique's seeming victory is overruled by committee. It is "distinct", not "unique", wherein lies the nub. No review.
........
r20789 | extempore | 2010-02-04 14:33:06 +0100 (Thu, 04 Feb 2010) | 5 lines
The remainder of isolating MSIL from the rest of the classpath code.
To accomplish this I made ClassRep an inner class of ClassPath (which given the broad definition of ClassPath already in place, it conceptually is already) and as a bonus this allowed dropping its type parameter. Review by rytz.
........
r20790 | extempore | 2010-02-04 16:22:03 +0100 (Thu, 04 Feb 2010) | 3 lines
Some minor cleanups in reflect. Moved the apply on Traverser back into the compiler so Traversers can define whatever apply is relevant to them. No review.
........
r20791 | phaller | 2010-02-04 18:03:57 +0100 (Thu, 04 Feb 2010) | 1 line
Fixed issue in partest where result of tests that timed out was not printed. Improved reporting. Added support for JUnit report files.
........
r20793 | extempore | 2010-02-04 18:15:18 +0100 (Thu, 04 Feb 2010) | 2 lines
Some hardening in the repl, and removing some functions which now exist in the standard library. No review.
........
r20794 | extempore | 2010-02-04 19:50:47 +0100 (Thu, 04 Feb 2010) | 6 lines
Raised the level of abstraction (slightly, not enough) on ClassPath by defining the difference between optimized and regular classpaths in terms of an arbitrary name filter instead of in terms of settings.XO. Altered the decision logic to look at the value of -Yinline instead of -optimise. Closes #2950. Review by rytz.
........
r20795 | rompf | 2010-02-04 20:02:16 +0100 (Thu, 04 Feb 2010) | 1 line
Added byval mode and annotation checker hook for weak lub. Review by odersky.
........
r20796 | extempore | 2010-02-04 21:21:44 +0100 (Thu, 04 Feb 2010) | 9 lines
Made a whole WithFilter class for Option after discovering this bug:

  scala> def f(x: AnyRef) = for (p <- Option(x)) yield p
  f: (x: AnyRef)Option[AnyRef]

  scala> def f(x: AnyRef) = for (p <- Option(x) ; if true) yield p
  f: (x: AnyRef)Iterable[AnyRef]

The for comprehension logic apparently prefers to convert Option to Iterable to get at the withFilter method over using Option's filter.
........
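[Editorial aside, not part of the original log: the r20796 transcript above shows the pre-fix behavior. With Option's own `withFilter` in place, a guarded for-comprehension keeps the Option type, which current Scala versions exhibit:

```scala
// With a withFilter on Option, adding a guard no longer widens the
// result to Iterable; the comprehension stays an Option throughout.
def f(x: AnyRef): Option[AnyRef] =
  for (p <- Option(x); if p.toString.nonEmpty) yield p

val kept    = f("hello")  // guard passes, value retained
val dropped = f("")       // guard fails, result is None
```

The return type annotation would have been a compile error against the pre-fix library, where the guard forced `Iterable[AnyRef]`.]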
r20797 | rompf | 2010-02-04 21:52:03 +0100 (Thu, 04 Feb 2010) | 1 line
fixed previous commit. No review.
........
r20798 | extempore | 2010-02-05 00:08:44 +0100 (Fri, 05 Feb 2010) | 1 line
Taking a swing at fixing -optimise. No review.
........
r20802 | dubochet | 2010-02-05 14:12:34 +0100 (Fri, 05 Feb 2010) | 1 line
[scaladoc] Fixed issue with failing Windows build. Code by Pedro Furlanetto, no review.
........
r20803 | ilyas | 2010-02-05 17:53:41 +0100 (Fri, 05 Feb 2010) | 1 line
some scalap tweaks
........
r20805 | extempore | 2010-02-05 19:33:09 +0100 (Fri, 05 Feb 2010) | 2 lines
Discovered that List's deprecated removeDuplicates didn't survive the renaming of unique to distinct. No review.
........
r20806 | ilyas | 2010-02-05 21:06:07 +0100 (Fri, 05 Feb 2010) | 1 line
scalap output bug fixed
........
r20807 | ilyas | 2010-02-05 21:54:40 +0100 (Fri, 05 Feb 2010) | 1 line
testdata changed
........
r20812 | extempore | 2010-02-06 05:15:08 +0100 (Sat, 06 Feb 2010) | 4 lines
A more MSIL-aware attempt at isolating the platform dependent pieces of Global and ClassPath so we don't introduce unwanted dependencies. Introduces a small interface backend.Platform which encapsulates that data. Review by rytz, odersky.
........
r20816 | extempore | 2010-02-07 01:13:55 +0100 (Sun, 07 Feb 2010) | 2 lines
Some code duplication removal as I inch us toward consistent classpath handling. No review.
........
r20818 | milessabin | 2010-02-07 23:14:40 +0100 (Sun, 07 Feb 2010) | 1 line
IntelliJ project metadata updated for new location. Thanks to Tony Coates for the patch.
........
r20826 | rytz | 2010-02-08 16:37:34 +0100 (Mon, 08 Feb 2010) | 1 line
fix msil build (nested classes in particular). no review.
........
r20828 | phaller | 2010-02-08 18:15:22 +0100 (Mon, 08 Feb 2010) | 1 line
Re-added deprecated member to scala.actors.Future. No review necessary.
........
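[Editorial aside, not part of the original log: `distinct`, settled on in r20788 and cleaned up in r20805 above, remains the collections API today; a minimal sketch:

```scala
// distinct removes duplicates while preserving first-occurrence order,
// which is what removeDuplicates (and briefly "unique") used to do.
val xs = List(1, 2, 2, 3, 1)
val unique = xs.distinct  // List(1, 2, 3)
```

Order preservation is part of the contract, unlike converting through a Set, where ordering is unspecified.]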
r20829 | dubochet | 2010-02-08 22:28:30 +0100 (Mon, 08 Feb 2010) | 5 lines
[scaladoc] Many improvements in the UI for Scaladoc's entity index (left pane):
- It is possible to "focus" on a package to restrict searches to it.
- Filtering in the left pane no longer blocks the UI.
- The filter tool for packages is easily recognizable for what it is, not just an empty, mysterious space.
Review by community.
........
r20830 | dubochet | 2010-02-08 22:28:47 +0100 (Mon, 08 Feb 2010) | 1 line
[scaladoc] Fully qualified names are displayed in tooltips instead of using in-place growth. All inherited members can be filtered in a single operation. Contributed by Pedro Furlanetto, checked by dubochet, no review.
........
r20831 | extempore | 2010-02-08 23:28:17 +0100 (Mon, 08 Feb 2010) | 20 lines
Some work on classpaths. This implements the specification at
https://lampsvn.epfl.ch/trac/scala/wiki/Classpath
modulo some minor details which remain to be investigated. It is not entirely integrated, and should not involve any behavioral changes. The patch also contains a number of small improvements targeting widely duplicated code.

PathResolver offers a main method. If run with no arguments it will output a pile of information about classpath-relevant environment vars and properties. If given arguments, it will output the classpath info that any scala runner script would use if given the same args. There is a wrapper in the tools directory. Example:

  tools/pathResolver -extdirs /foo -sourcepath /bar | egrep "sourcePath|scalaExtDirs"
  scalaExtDirs = /foo
  sourcePath = /bar

There is also a (probably temporary) command line option -Ylog-classpath which will print out the settings.classpath value anytime it changes. Review by community.
........
r20832 | extempore | 2010-02-09 00:09:49 +0100 (Tue, 09 Feb 2010) | 8 lines
Until now directories on the classpath were not considered for repl completion because of the risk of accidentally traversing large chunks of the filesystem (e.g. "."
in the path, run from /). Some low hanging fruit was available - at least SCALA_HOME can be considered safe, and then we get the scala classes. The main impact of this is that completion now works for the built-in classes when you run build/quick/bin/scala. Review by community. ........ r20834 | milessabin | 2010-02-09 10:58:24 +0100 (Tue, 09 Feb 2010) | 1 line Export missing package. ........ r20835 | milessabin | 2010-02-09 11:15:16 +0100 (Tue, 09 Feb 2010) | 1 line Compiler part of fix for #2767: provide hooks to allow the presentation compiler to add sources to the run to resolve top-level symbols which cannot be found via the Java naming convention. Review by odersky. ........ r20840 | prokopec | 2010-02-09 16:31:11 +0100 (Tue, 09 Feb 2010) | 1 line `replaceSomeIn` method added. Removed `replaceAllIn` taking a String to String. ........ r20842 | extempore | 2010-02-09 19:37:44 +0100 (Tue, 09 Feb 2010) | 1 line No double-processing format strings. Closes #3040. No review. ........ r20843 | rompf | 2010-02-09 20:08:07 +0100 (Tue, 09 Feb 2010) | 1 line some small byval mode changes. review by odersky. ........ r20844 | dragos | 2010-02-09 20:21:34 +0100 (Tue, 09 Feb 2010) | 1 line Fixed partially specialized classes. Closes #2880. No review. ........ r20849 | rytz | 2010-02-10 10:03:13 +0100 (Wed, 10 Feb 2010) | 1 line close #3003. no review, already done by dragos. ........ r20850 | prokopec | 2010-02-10 12:12:55 +0100 (Wed, 10 Feb 2010) | 1 line fixes #3046 ........ r20851 | rompf | 2010-02-10 15:21:57 +0100 (Wed, 10 Feb 2010) | 1 line modified typing of while loops to allow other types than Unit (e.g. Unit @cps). review by odersky. ........ r20852 | rytz | 2010-02-10 15:48:58 +0100 (Wed, 10 Feb 2010) | 1 line close #2984. review by community. ........ r20854 | plocinic | 2010-02-10 17:51:13 +0100 (Wed, 10 Feb 2010) | 1 line Closes #2651 ........ 
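The `replaceSomeIn` method added in r20840 lives on `scala.util.matching.Regex`: it replaces only those matches for which the replacer function returns `Some`, leaving `None` matches untouched. A small sketch (hypothetical input text and pattern):

```scala
import scala.util.matching.Regex

// replaceSomeIn: replace a match only when the replacer returns Some.
val date: Regex = """(\d{4})-(\d{2})-(\d{2})""".r
val text = "due 2010-02-09, archived 1999-12-31"
val result = date.replaceSomeIn(text, m =>
  if (m.group(1) == "2010") Some("<current>") else None)
assert(result == "due <current>, archived 1999-12-31")
```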
r20855 | phaller | 2010-02-10 18:11:11 +0100 (Wed, 10 Feb 2010) | 1 line partest no longer treats remaining .log files as tests. No review necessary. ........ r20856 | extempore | 2010-02-10 20:51:38 +0100 (Wed, 10 Feb 2010) | 5 lines More work on classpaths. This commit should restore the ability to have command line options following source files, at the price of temporarily breaking tools/pathResolver. Working my way through all the usages of classpath in trunk zeroing in on fully consistent handling. Review by community. ........ r20857 | extempore | 2010-02-11 00:20:02 +0100 (Thu, 11 Feb 2010) | 1 line Disabled failing test. Review by plocinic. ........ r20858 | plocinic | 2010-02-11 00:29:54 +0100 (Thu, 11 Feb 2010) | 1 line Fixes #3045. No review. ........ r20862 | plocinic | 2010-02-11 11:14:59 +0100 (Thu, 11 Feb 2010) | 1 line Removed leftovers of r20857, added test for #3045 ........ r20863 | extempore | 2010-02-11 14:47:13 +0100 (Thu, 11 Feb 2010) | 2 lines Trying to get when "." is added to the classpath under control. Band-aid for an obscure bit of fallout closes #3049. No review. ........ r20864 | dubochet | 2010-02-11 14:55:34 +0100 (Thu, 11 Feb 2010) | 1 line [scaladoc] Fixed popup content lookup so that it works on all browsers. Speed-up in entity index search (according to jQuery manual, observed no notable difference). Some small aesthetic cleanups in the way index initialization and filtering behaves. No review. ........ r20866 | extempore | 2010-02-11 16:10:45 +0100 (Thu, 11 Feb 2010) | 6 lines More work on classpaths. This commit also unbreaks fsc, for which we must have no test cases at all. In the short term there will probably be a few more minor disruptions since with classpaths constructed a half dozen different ways, achieving consistency requires flushing out the undocumented accidents upon which any given island might depend. Review by community. ........ 
r20867 | rompf | 2010-02-11 17:58:12 +0100 (Thu, 11 Feb 2010) | 1 line added annotation checker hook for Types.isWithinBounds. needed to allow functions of type T => A @cps[B,C] even though A @cps[B,C] is not a subtype of Any. review by odersky. ........ r20869 | extempore | 2010-02-12 00:37:15 +0100 (Fri, 12 Feb 2010) | 8 lines The non-intrusive bits of my hopefully pending "use the static type of the scrutinee to rule out some type/extractor patterns" patch. Includes a cleanup of the (still inadequate) type-parameter-ignoring match test which had been interfering with martin's digestion. Also: implicit search is disabled when typing a pattern, because the matcher never invokes implicits to satisfy a pattern. At worst maybe we'll get a performance bump. No review. ........ r20872 | dubochet | 2010-02-12 16:53:39 +0100 (Fri, 12 Feb 2010) | 1 line [scaladoc] Fixes for IE 8 compatibility. Partially contributed by Pedro Furlanetto, no review. ........ r20873 | prokopec | 2010-02-12 19:20:26 +0100 (Fri, 12 Feb 2010) | 1 line Fixes #3046 once more. No review is necessary. ........ r20874 | dubochet | 2010-02-12 19:43:49 +0100 (Fri, 12 Feb 2010) | 1 line [scaladoc] Added "display packages only" filter to entity index. No review. ........ r20878 | extempore | 2010-02-14 01:08:01 +0100 (Sun, 14 Feb 2010) | 7 lines Some change to classpath handling in r20866 has left quick in a condition where it won't load Array. After a fair bit of beating my head against the wall as to why, I determined that everything works if I simply don't throw the exception it used to throw. In the short term I am committing this so quick works, and I will continue the investigation. Review by dragos (2 line patch to minimize reviewer burden.) ........ r20879 | extempore | 2010-02-14 02:00:37 +0100 (Sun, 14 Feb 2010) | 2 lines Reducing the amount of low-level classpath manipulation going on around town. No review. ........ 
r20880 | extempore | 2010-02-14 07:52:11 +0100 (Sun, 14 Feb 2010) | 2 lines Added some error logic so if #2956 strikes again we'll have a better idea why. No review. ........ r20881 | extempore | 2010-02-14 09:47:18 +0100 (Sun, 14 Feb 2010) | 4 lines More classpath work, and cleanups in the vicinities of everything manipulating classpaths. Review by anyone willing to slog through the approximately dozen different ways the classpath can be influenced. ........ r20884 | extempore | 2010-02-15 07:32:50 +0100 (Mon, 15 Feb 2010) | 3 lines Restored the disabled exception in classfileparser. Strange quick behavior was being caused by multiple occurrences of some classpath elements. No review. ........ r20885 | dragos | 2010-02-15 11:12:00 +0100 (Mon, 15 Feb 2010) | 1 line Merge branch 'fix-specialized' ........ r20886 | extempore | 2010-02-15 17:41:17 +0100 (Mon, 15 Feb 2010) | 6 lines Disabled JavaInteraction test. This test has been costing me a lot of time because it fails if you can't connect to the screen of the test machine. And then if any test fails, the stability test doesn't run. We badly need a separate testing area for tests which are prone to failure for reasons which are unrelated to the quality ostensibly being tested. No review. ........ r20887 | extempore | 2010-02-15 17:55:21 +0100 (Mon, 15 Feb 2010) | 1 line Fix for the out-of-date showpickled. No review. ........ r20888 | extempore | 2010-02-15 21:00:36 +0100 (Mon, 15 Feb 2010) | 18 lines Some new tools for the tools directory. Everything in this commit amounts to a yak shaving expedition to enable this, which now works: tools/diffPickled scala.Either and since stability is presently broken you will see the following. (When it's not broken you will see nothing.) 
541,544c541,544 < 538,4090: EXTref 3: 539(Left) 2 < 539,4095: TYPEname 4: Left < 540,4101: EXTref 3: 541(Right) 2 < 541,4106: TYPEname 5: Right --- > 538,4090: EXTref 3: 539(Right) 2 > 539,4095: TYPEname 5: Right > 540,4102: EXTref 3: 541(Left) 2 > 541,4107: TYPEname 4: Left ........ r20889 | extempore | 2010-02-15 22:44:28 +0100 (Mon, 15 Feb 2010) | 2 lines Rewrote my own submitted code of a year ago from trac and added scalawhich to the tools dir. Closes #657. ........ r20894 | extempore | 2010-02-16 17:31:29 +0100 (Tue, 16 Feb 2010) | 1 line Some minor bugfixing/refining to completion. No review. ........ r20895 | extempore | 2010-02-16 20:57:44 +0100 (Tue, 16 Feb 2010) | 2 lines Some prestidigitation improving the repl startup time. The prompt is quicker than the eye! No review. ........ r20896 | extempore | 2010-02-16 20:59:22 +0100 (Tue, 16 Feb 2010) | 1 line Last minute change broke the last commit. Fixing. No review. ........ r20897 | extempore | 2010-02-16 22:03:43 +0100 (Tue, 16 Feb 2010) | 1 line Trying again to unbreak the repl patch. No review. ........ r20898 | extempore | 2010-02-16 22:22:10 +0100 (Tue, 16 Feb 2010) | 2 lines Unix scripts pass -D options to the underlying JVM invocation. Closes #1222. Review by community. ........ r20899 | extempore | 2010-02-16 22:41:01 +0100 (Tue, 16 Feb 2010) | 2 lines Made partest stop crashing on test directories without a lib directory. No review. ........ r20900 | extempore | 2010-02-16 23:28:27 +0100 (Tue, 16 Feb 2010) | 3 lines Altered Symbol.isLess to sort on initName before id. No longer will slightly different classpaths break the stability test. Review by odersky. ........ r20901 | extempore | 2010-02-17 00:48:32 +0100 (Wed, 17 Feb 2010) | 5 lines Took a less ambitious approach to restoring stability. Leave isLess as it was and have the pickler sort without using isLess. 
Interestingly this approach still leaves a class failing the stability test (scala/actors/remote/Apply0.class) so a little more will be needed. Review by odersky. ........ r20905 | milessabin | 2010-02-17 01:47:55 +0100 (Wed, 17 Feb 2010) | 1 line Fix and test case for #3031. Review by odersky. ........ r20906 | plocinic | 2010-02-17 12:34:35 +0100 (Wed, 17 Feb 2010) | 1 line Checking the symbols of parameters in overloaded methods didn't seem to work in all cases. Apparently the enclosing class of the owner of the parameter was changing during the compilations from trait to the implementation class. This was causing annoying excessive compilation for Types.scala. ........ r20907 | prokopec | 2010-02-17 13:22:37 +0100 (Wed, 17 Feb 2010) | 1 line Array copy method fixed, Fixes #3065. Review by community. ........ r20908 | prokopec | 2010-02-17 13:23:13 +0100 (Wed, 17 Feb 2010) | 1 line Test file for last commit. ........ r20909 | milessabin | 2010-02-17 18:15:51 +0100 (Wed, 17 Feb 2010) | 1 line Fix for silly mistake in [20835]. No review. ........ r20911 | extempore | 2010-02-17 20:01:22 +0100 (Wed, 17 Feb 2010) | 1 line Added a copy() method to Settings. No review. ........ r20912 | extempore | 2010-02-17 22:26:02 +0100 (Wed, 17 Feb 2010) | 5 lines A variety of changes to partest made in a quest to get it to reveal the classpaths it is using. No longer will partest actively sabotage your efforts to pass -Dpartest.debug=true by inserting "-Dpartest.debug=" after yours! And etc. Review by haller (if so inclined.) ........ r20913 | extempore | 2010-02-17 22:37:08 +0100 (Wed, 17 Feb 2010) | 2 lines ...and managed to miss the key file in getting past partest. No review. ........ r20917 | extempore | 2010-02-18 06:39:18 +0100 (Thu, 18 Feb 2010) | 4 lines Fix for recent stability issue with Apply0. Look, shuffling classpaths flushes out bugs, we should do this all the time. 
Review by odersky even though he authored it, because reliving one's own patches is the key to a long and healthy life. ........ r20918 | extempore | 2010-02-18 06:40:27 +0100 (Thu, 18 Feb 2010) | 10 lines Tighter pattern matching hits the street. If the scrutinee is final and does not conform to the pattern, it will no longer compile. See all the exciting things you can no longer do: "bob".reverse match { case Seq('b', 'o', 'b') => true } // denied! "bob".toArray match { case Seq('b', 'o', 'b') => true } // rejected! final class Dunk def f3(x: Dunk) = x match { case Seq('b', 'o', 'b') => true } // uh-uh! And so forth. Review by odersky. ........ r20920 | extempore | 2010-02-18 19:49:21 +0100 (Thu, 18 Feb 2010) | 2 lines The first reasonably satisfying classpath commit. We are 90% there with this one. Documentation to come. Review by community. ........ r20921 | extempore | 2010-02-18 20:00:46 +0100 (Thu, 18 Feb 2010) | 1 line Made NumericRange invariant again, plus test case. Closes #2518. ........ r20922 | milessabin | 2010-02-18 22:51:39 +0100 (Thu, 18 Feb 2010) | 1 line Patch from Mirko Stocker correcting start postions of import AST nodes for refactoring and other tools. Review by community. ........ r20928 | milessabin | 2010-02-19 16:31:36 +0100 (Fri, 19 Feb 2010) | 1 line Fixed #3043 and #3043; fixed a regression with hover/hyperlinks on import statements; don't attempt to parse out top-level types from non-Scala sources. Review by community. ........ r20929 | extempore | 2010-02-19 22:43:35 +0100 (Fri, 19 Feb 2010) | 1 line Some script fixes tied up with classpaths. No review. ........ r20932 | extempore | 2010-02-20 05:47:43 +0100 (Sat, 20 Feb 2010) | 2 lines Temporarily reverting r20928 as it is leading to ant dist crashing. Review by milessabin. ........ 
r20933 | extempore | 2010-02-20 06:38:34 +0100 (Sat, 20 Feb 2010) | 3 lines Some cleanups on the scalacfork ant task since I'm clearly going to have to go through everything which touches classpaths in any way shape or form. No review. ........ r20934 | extempore | 2010-02-20 07:48:29 +0100 (Sat, 20 Feb 2010) | 3 lines Altered the ant task to generate a -Dscala.home= property, which now acts as signal to scalac to ignore the java classpath and use only the scala classpath. No review. ........ r20935 | extempore | 2010-02-20 21:00:46 +0100 (Sat, 20 Feb 2010) | 2 lines Band-aid for #3081, issue should receive more comprehensive treatment. Review by imaier. ........ r20936 | milessabin | 2010-02-20 21:32:40 +0100 (Sat, 20 Feb 2010) | 1 line Another attempt at retaining ill-typed trees for the IDE, this time without breaking scaladoc. Review by extempore. ........ r20937 | extempore | 2010-02-20 22:16:49 +0100 (Sat, 20 Feb 2010) | 3 lines Having some challenges confirming the validity of the bootstrap process given starr's slightly dated classpath code, so this is a new starr based on r20934. No review. ........ r20938 | extempore | 2010-02-20 23:50:20 +0100 (Sat, 20 Feb 2010) | 4 lines Lowering the noise level in the classpath debugging output. Try ant -Dscalac.args="-Ylog-classpath" if you would like the rare joy of having a fair idea what is being used to compile what. No review. ........ r20941 | extempore | 2010-02-21 06:01:55 +0100 (Sun, 21 Feb 2010) | 1 line Some repl cleanups and debugging conveniences. No review. ........ r20942 | extempore | 2010-02-21 22:56:56 +0100 (Sun, 21 Feb 2010) | 2 lines Some more code for seeing what's going on in scalac's mind with respect to who to load when and from where. No review. ........ r20944 | extempore | 2010-02-22 01:15:32 +0100 (Mon, 22 Feb 2010) | 2 lines More laboring on Settings, ClassPaths, Ant Tasks, Partest, and similar epicenters of thrilldom. No review. ........ 
r20945 | extempore | 2010-02-22 01:16:03 +0100 (Mon, 22 Feb 2010) | 1 line Some deprecation patrol and minor cleanups. No review. ........ r20949 | dragos | 2010-02-22 14:11:49 +0100 (Mon, 22 Feb 2010) | 1 line Specialized types are not substituted inside type arguments. Closes #3085, no review. ........ r20950 | phaller | 2010-02-22 18:43:40 +0100 (Mon, 22 Feb 2010) | 4 lines - Added fair mode to ForkJoinScheduler, which submits tasks to global queue with a 2% chance - Reactor uses ForkJoinScheduler by default - Moved ActorGC to scheduler package - Various clean-ups ........ r20951 | extempore | 2010-02-22 22:47:41 +0100 (Mon, 22 Feb 2010) | 3 lines A simple command line parser to consolidate a bunch of different implementations floating around trunk. And rolled it out in partest's ConsoleRunner. Review by community. ........ r20952 | milessabin | 2010-02-23 00:39:10 +0100 (Tue, 23 Feb 2010) | 1 line Yet another attempt at retaining ill-typed trees for the IDE ... this time docs.comp should be unbroken. ........ r20953 | extempore | 2010-02-23 01:27:39 +0100 (Tue, 23 Feb 2010) | 5 lines Some much needed housecleaning regarding system properties. If you can possibly resist the temptation, it'd be great if people could try to go through the properties classes to get and set them, and also to set property values somewhere fixed rather than using strings directly. Review by community. ........ r20954 | extempore | 2010-02-23 04:49:01 +0100 (Tue, 23 Feb 2010) | 2 lines More fun with ClassPaths. Could that be the home stretch I see? Review by community. ........ r20955 | extempore | 2010-02-23 05:42:53 +0100 (Tue, 23 Feb 2010) | 1 line Oops, I broke jline in r20953. Fixed. No review. ........ r20956 | extempore | 2010-02-23 07:57:53 +0100 (Tue, 23 Feb 2010) | 2 lines Abstracting out a few more platform specific elements now that we have a facility for doing so. Review by rytz. ........ 
r20957 | extempore | 2010-02-23 09:21:00 +0100 (Tue, 23 Feb 2010) | 1 line Introducing partest to the idea of code reuse. No review. ........ r20958 | extempore | 2010-02-23 09:32:53 +0100 (Tue, 23 Feb 2010) | 11 lines Added productElementName to Product. Now you can access all the case class field names your heart desires. Review by odersky. scala> case class Foo[T](kilroy: String, burma: List[T], shave: Seq[Int]) defined class Foo scala> Foo("was here", List('myanmar), Seq(25, 25)) res0: Foo[Symbol] = Foo(was here,List('myanmar),List(25, 25)) scala> 0 to 2 map (res0 productElementName _) res1: IndexedSeq[String] = IndexedSeq(kilroy, burma, shave) ........ r20959 | imaier | 2010-02-23 12:18:34 +0100 (Tue, 23 Feb 2010) | 1 line Added TextComponent.paste, made some accidentally public publisher methods protected. ........ r20960 | imaier | 2010-02-23 12:33:50 +0100 (Tue, 23 Feb 2010) | 1 line Fix for #3084 ........ r20961 | dubochet | 2010-02-23 15:19:20 +0100 (Tue, 23 Feb 2010) | 1 line [scaladoc] Updated jQuery to version 1.4.2. No review. ........ r20962 | phaller | 2010-02-23 18:34:50 +0100 (Tue, 23 Feb 2010) | 1 line Control-flow combinators do not require thread-local variable in Reactor. Review by plocinic. ........ r20963 | extempore | 2010-02-23 19:15:37 +0100 (Tue, 23 Feb 2010) | 3 lines After the compiler refactor, we ended up with identical copies of PickleBuffer and PickleFormat in the library and compiler. Deleted the compiler versions and updated references. No review. ........ r20964 | extempore | 2010-02-23 19:15:59 +0100 (Tue, 23 Feb 2010) | 1 line Removed now redundant function splitParams. No review. ........ r20965 | prokopec | 2010-02-23 19:57:52 +0100 (Tue, 23 Feb 2010) | 1 line Fixes #3018. Review by extempore. ........ r20966 | phaller | 2010-02-23 20:36:38 +0100 (Tue, 23 Feb 2010) | 1 line Fixed tests to unbreak build. No review. ........ 
r20967 | extempore | 2010-02-23 22:16:51 +0100 (Tue, 23 Feb 2010) | 4 lines The initial results of running a basic cut and paste detector over trunk and trying to undo some of it. I can live with history but if I see Cutty McPastington in new commits I will get all finger waggly. Don't make me cross that ocean. No review. ........ r20968 | milessabin | 2010-02-23 22:59:14 +0100 (Tue, 23 Feb 2010) | 1 line Improved fix for #2767. Review by community. ........ r20969 | extempore | 2010-02-23 23:15:38 +0100 (Tue, 23 Feb 2010) | 1 line Fixed a little command line partest bug I introduced. No review. ........ r20970 | milessabin | 2010-02-23 23:34:17 +0100 (Tue, 23 Feb 2010) | 1 line Removed stray debugging code. ........ r20972 | extempore | 2010-02-24 02:59:44 +0100 (Wed, 24 Feb 2010) | 1 line Another update for ShowPickled. No review. ........ r20973 | extempore | 2010-02-24 03:27:03 +0100 (Wed, 24 Feb 2010) | 6 lines Updated scalacheck jar to current trunk. Tracked down why it's not being used. Updated partest with a --scalacheck option. Added scalacheck tests to the ant build target. Still struggling with ant/partest/classpaths so it's not on by default yet, but at least ./partest --scalacheck works. We... will... use... scalacheck. And we will like it! No review. ........ r20974 | phaller | 2010-02-24 13:18:45 +0100 (Wed, 24 Feb 2010) | 1 line Fixed problem with daemon actor termination. No review. ........ r20976 | dubochet | 2010-02-24 16:51:10 +0100 (Wed, 24 Feb 2010) | 1 line [scaladoc] Preliminary support for links and lists in wiki syntax, as well as printing of lists. Fix to display of "inherits" classes. Contributed by Pedro Furlanetto, no review. Member names in signature are more contrasted and are aligned. ........ r20977 | prokopec | 2010-02-24 17:55:31 +0100 (Wed, 24 Feb 2010) | 1 line Fixes #3088. No review. ........ r20978 | extempore | 2010-02-25 00:17:30 +0100 (Thu, 25 Feb 2010) | 22 lines Bash completion! 
The file is automatically created as part of the build process and placed in $pack/etc. % scala -Xprint -Xprint-icode -Xprint-pos -Xprint-types -Xprint: % scala -Xprint: all flatten mixin tailcalls cleanup icode namer terminal closelim inliner packageobjects typer constructors jvm parser uncurry dce lambdalift pickler erasure lazyvals refchecks explicitouter liftcode superaccessors % scala -g: line none notailcalls source vars % scala -Ystruct-dispatch: invoke-dynamic mono-cache no-cache poly-cache Review by community. ........ r20979 | extempore | 2010-02-25 01:14:04 +0100 (Thu, 25 Feb 2010) | 4 lines The build file wasn't quite all the way on the bash completion commit. Now it should work, and also be copied into the distribution. Review by anyone who may be cruel enough to oppose including completion in the distribution. ........ r20980 | extempore | 2010-02-25 02:35:46 +0100 (Thu, 25 Feb 2010) | 8 lines Created -Yfatal-warnings, as otherwise I can't see how to make partest fail on the presence of an unchecked warning. It'll come in handy anyway. Now we have a real tough guy's command line for those who want it: % scalac -unchecked -deprecation -Yfatal-warnings `find . -name '*.scala'` Not for the timid. Review by community. ........ r20981 | extempore | 2010-02-25 03:36:06 +0100 (Thu, 25 Feb 2010) | 1 line Fix for a king-sized last-minute thinko. No review. ........ r20982 | extempore | 2010-02-25 03:53:35 +0100 (Thu, 25 Feb 2010) | 3 lines Some debugging code for partest. If --debug is given it collects timings on all the individual tests and prints it sorted by glacialness. Review by community. ........ r20983 | extempore | 2010-02-25 03:54:04 +0100 (Thu, 25 Feb 2010) | 4 lines Tweaking the sealed logic in light of #3097. Closes #3097. Reorganizes children a little so they always come back sorted the same way the pickler does. Taking advantage of -Yfatal-warnings in the test case. Review by community. ........ 
r20984 | extempore | 2010-02-25 04:15:34 +0100 (Thu, 25 Feb 2010) | 1 line Trying to fix svn props on scalacheck jar. No review. ........ r20985 | extempore | 2010-02-25 07:08:27 +0100 (Thu, 25 Feb 2010) | 5 lines It turns out some of the weirdness lately is because changes to classpath handling have a way of not taking effect until they're installed via starr, and presently we have a starr with different logic than trunk. No choice but to roll up one more starr based on r20984. No review. ........ r20986 | prokopec | 2010-02-25 16:16:24 +0100 (Thu, 25 Feb 2010) | 1 line Fixes #3088. No review. ........ r20987 | dragos | 2010-02-25 16:53:13 +0100 (Thu, 25 Feb 2010) | 1 line Made the squeezer worthy of its name: a block with an empty list of stats and another block as the result expression is rewritten to the inner block (recursively). This makes the output from the pattern matcher nicer, eliminating loads of empty nested blocks. Review by extempore. ........ r20988 | phaller | 2010-02-25 17:20:25 +0100 (Thu, 25 Feb 2010) | 1 line Physically moved ActorGC to scheduler directory. ........ r20989 | phaller | 2010-02-25 17:26:05 +0100 (Thu, 25 Feb 2010) | 1 line Made doc comment consistent. No review. ........ r20990 | extempore | 2010-02-25 19:24:58 +0100 (Thu, 25 Feb 2010) | 2 lines Working on making the bootstrap process transparent and consistent. And removed a bunch of what is now cruft in partest. No review. ........ r20991 | extempore | 2010-02-25 20:49:25 +0100 (Thu, 25 Feb 2010) | 3 lines Looking at iulian's patch to the squeezer sent me off looking at equalsStructure, which clearly was written in a bygone era. Rewritten with modern tools. No review. ........ r20992 | extempore | 2010-02-25 20:50:28 +0100 (Thu, 25 Feb 2010) | 2 lines More partest cleanups, and putting back a couple lines in build.xml which were left a little too commented out. No review. ........ 
r20993 | extempore | 2010-02-25 21:50:43 +0100 (Thu, 25 Feb 2010) | 2 lines What appears to be a workaround for #3082, which I am hitting literally 20 times a day. Will detail in ticket. Review by odersky. ........ r20994 | extempore | 2010-02-25 22:20:59 +0100 (Thu, 25 Feb 2010) | 2 lines More return type annotation to work around my other frequent guest in the world of #3082-connected pickler bugs. No review. ........ r20995 | extempore | 2010-02-25 23:09:37 +0100 (Thu, 25 Feb 2010) | 2 lines Added a --bare option to ShowPickled so I can diff signatures without all the explicit indices blowing any points of similarity. No review. ........ r20996 | moors | 2010-02-26 10:16:22 +0100 (Fri, 26 Feb 2010) | 5 lines closes #2797 -- no review (already done in ticket by Martin) 1) isHigherKindedType is now false for singletontype 2) toInstance recurses when pre is a typevar: the problem is that pre.widen.typeSymbol isNonBottomSubClass symclazz is true while pre.baseType(symclazz) is NoType ........ r20997 | moors | 2010-02-26 10:42:50 +0100 (Fri, 26 Feb 2010) | 4 lines closes #2956 the problem was that corresponds on Seq's does not check length of sequences before testing the predicate, whereas in some cases that predicate relied on this invariant (when it was doing substitution) ........ r20998 | moors | 2010-02-26 11:00:49 +0100 (Fri, 26 Feb 2010) | 1 line see #2634: updated docs to indicate zipped is strict ........ r20999 | moors | 2010-02-26 11:50:32 +0100 (Fri, 26 Feb 2010) | 2 lines closes #2421 -- now also deals with chained implicits no review ........ r21000 | moors | 2010-02-26 12:04:02 +0100 (Fri, 26 Feb 2010) | 2 lines closes #2741 closes #3079 no review worksforme ........ r21001 | rytz | 2010-02-26 13:49:02 +0100 (Fri, 26 Feb 2010) | 1 line close #3071. look at owner.info only later in type completer, not during namer. review by odersky ........ 
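The `corresponds` fix in r20997 above concerns the invariant that the two sequences must have the same length before the predicate is trusted. A sketch of the post-fix behavior on current Scala (hypothetical values):

```scala
// corresponds is true only when both sequences have the same length
// and the predicate holds for every aligned pair of elements.
val a = List(1, 2, 3)
val doubled = List(2, 4, 6)
assert(a.corresponds(doubled)(_ * 2 == _))     // same length, pairwise true
assert(!a.corresponds(List(2, 4))(_ * 2 == _)) // length mismatch: false
```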
r21003 | extempore | 2010-02-26 18:03:22 +0100 (Fri, 26 Feb 2010) | 3 lines Undid my awful code which had broken the thread scheduler selection. Further unbroke it beyond that unbreaking hopefully to the point where java 1.6 on OSX is recognized as such. Review by haller. ........ r21004 | odersky | 2010-02-26 19:31:35 +0100 (Fri, 26 Feb 2010) | 1 line closes #3082, review by rytz ........ r21005 | extempore | 2010-02-26 23:24:46 +0100 (Fri, 26 Feb 2010) | 14 lines Quite a lot more work on XML equality than I can properly justify spending time on, but you know how it is once you get started on something. This commit will likely break some code out there in the world but this is impossible to avoid if we are to achieve sane equality in trunk. For anyone who was relying upon the 2.7 equality behavior for scala.xml.* classes, using "xml_==" instead of "==" for comparisons will restore the old behavior. The standard == on xml elements now attempts to behave in such a way that symmetry and hashCode contracts will be preserved. It's probably not 100% there yet, but I can tell you this: it is closer today than it was yesterday. Review by community. ........ r21006 | extempore | 2010-02-27 04:46:48 +0100 (Sat, 27 Feb 2010) | 17 lines Removed the partest restriction that individual files must be in the same set. Haven't you always wanted to do this? Now you can. Review by phaller. % ./partest `ack --type=scala -l HashSet | head -6` Testing individual files testing: [...]/files/jvm/serialization.scala [ OK ] testing: [...]/files/jvm/t1600.scala [ OK ] Testing individual files testing: [...]/files/pos/collections.scala [ OK ] testing: [...]/files/pos/t2183.scala [ OK ] Testing individual files testing: [...]/files/run/bug1074.scala [ OK ] testing: [...]/files/run/bug2512.scala [ OK ] ........ r21007 | extempore | 2010-02-27 06:47:03 +0100 (Sat, 27 Feb 2010) | 1 line Some library reorganization I discussed with martin. No review. ........ 
r21008 | extempore | 2010-02-27 21:10:48 +0100 (Sat, 27 Feb 2010) | 2 lines Special cased an error message for the common situation of calling AnyRef methods on Any or AnyVal. Review by odersky. ........ r21009 | extempore | 2010-02-28 00:49:51 +0100 (Sun, 28 Feb 2010) | 2 lines Expanded the check from #1392 to enclose #3123 as well so that "case Int => " doesn't crash. Closes #3123. Review by odersky. ........ r21010 | extempore | 2010-02-28 01:30:21 +0100 (Sun, 28 Feb 2010) | 13 lines While working on Any.## I ran across some interesting tests being made in TreeBuilder: val buf = new ListBuffer[(Name, Tree, Position)] [...] if (buf.iterator forall (name !=)) ... This is always true because a Name will never equal a Tuple3. Oh universal equality, will you never tire of toying with us? Given that this bug has existed since r12886 one might reasonably question the necessity of the conditionals falling prey to this, but leaving that for another day, it should at least check what it's trying to check. No review. ........ r21011 | extempore | 2010-02-28 15:35:50 +0100 (Sun, 28 Feb 2010) | 10 lines Added ## method to Any as our scala hashCode method which provides consistent answers for primitive types. And I'm sure we're all tired of new starrs, but it's hard to add a method to Any without one. This patch only brings ## into existence, but nothing calls it yet. // some true assertions scala> assert(5.5f.## == 5.5f.hashCode) scala> assert(5.0f.## != 5.0f.hashCode && 5.0f.## == 5L.##) No review. (Already reviewed by odersky.) ........ r21012 | extempore | 2010-02-28 17:52:03 +0100 (Sun, 28 Feb 2010) | 2 lines Modification to r21009 to preserve that classic invariant, (x || !x) && !(x && !x). No review. ........ r21013 | dragos | 2010-02-28 22:09:06 +0100 (Sun, 28 Feb 2010) | 2 lines Fixed specialized pattern matches. Incompatible cases are removed from specialized implementations. ........ 
r21014 | extempore | 2010-03-01 06:59:11 +0100 (Mon, 01 Mar 2010) | 6 lines Enabled scalacheck tests. Renamed the super confusing and what must be legacy scalatest.* properties to partest.*, boldly assuming that the fact that partest is pretty much unusable outside of scalac means there are no users outside of scalac who might be disrupted by eliminating old property names. Review by community. ........ r21015 | ilyas | 2010-03-01 11:25:59 +0100 (Mon, 01 Mar 2010) | 1 line Minor printer fix for singleton types ........ r21016 | odersky | 2010-03-01 11:59:46 +0100 (Mon, 01 Mar 2010) | 1 line Added one previously overlooked case for computing the right tparams of glbs of polytypes. This is a postscript to the fix of #3082. ........ r21017 | odersky | 2010-03-01 12:00:15 +0100 (Mon, 01 Mar 2010) | 1 line closed #3101. Review by community. ........ r21018 | odersky | 2010-03-01 12:02:12 +0100 (Mon, 01 Mar 2010) | 1 line Following a suggestion of jrudolph, made filterKeys and mapValues transform abstract maps, and duplicated functionality for immutable maps. Moved transform and filterNot from immutable to general maps. Review by phaller. ........ r21019 | odersky | 2010-03-01 12:11:25 +0100 (Mon, 01 Mar 2010) | 1 line Closes #3076. Review by community. ........ r21020 | ilyas | 2010-03-01 15:17:44 +0100 (Mon, 01 Mar 2010) | 1 line #3060 fixed ........ r21021 | odersky | 2010-03-01 15:35:58 +0100 (Mon, 01 Mar 2010) | 1 line Added missing file that broke the build. ........ r21022 | ilyas | 2010-03-01 15:51:38 +0100 (Mon, 01 Mar 2010) | 1 line #2885 fixed ........ r21023 | ilyas | 2010-03-01 15:52:00 +0100 (Mon, 01 Mar 2010) | 1 line typo in test fixed ........ r21024 | extempore | 2010-03-01 17:37:01 +0100 (Mon, 01 Mar 2010) | 3 lines Undeprecated Function.tupled based on the type inference issues documented at: http://stackoverflow.com/questions/2354277/function-tupled-and-placeholder-syntax We should revisit if anon function inference improves. Review by community. 
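The filterKeys/mapValues generalization in r21018 above made these two operations available on all maps. A sketch of what they compute (hypothetical values; note that in current Scala both are reached via `.view`, whereas in the 2.8-era API of r21018 they sat directly on Map):

```scala
val m = Map("a" -> 1, "b" -> 2, "c" -> 3)
// filterKeys keeps entries whose key satisfies the predicate;
// mapValues transforms each value while leaving the keys alone.
val noA  = m.view.filterKeys(_ != "a").toMap
val tens = m.view.mapValues(_ * 10).toMap
assert(noA == Map("b" -> 2, "c" -> 3))
assert(tens == Map("a" -> 10, "b" -> 20, "c" -> 30))
```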
........
r21027 | ilyas | 2010-03-01 19:37:55 +0100 (Mon, 01 Mar 2010) | 1 line

scalap tests fixed
........
r21028 | ilyas | 2010-03-01 20:27:20 +0100 (Mon, 01 Mar 2010) | 1 line

trailing spaces in decompiled annotations are trimmed
........
r21029 | extempore | 2010-03-01 20:47:48 +0100 (Mon, 01 Mar 2010) | 6 lines

Whipped ShowPickled until it would print out private[scope] from the signature, and infrastructure created along the way. Only now at this late hour do I realize that this work would be a lot better aimed at creating a fake Universe and then adapting UnPickler.Scan so you can reuse the real logic. My advice to the next guy: do that instead. No review.
........
r21031 | ilyas | 2010-03-02 02:02:55 +0100 (Tue, 02 Mar 2010) | 1 line

#3128 fixed
........
r21043 | extempore | 2010-03-02 20:44:01 +0100 (Tue, 02 Mar 2010) | 16 lines

Improved equality for Manifests. Implements symmetry via canEquals, and has ClassManifests compare according to erasure but full manifests also compare type arguments. Preserving symmetry means that some things you might expect to be equal are not:

  val m1 = scala.reflect.ClassManifest.fromClass(classOf[List[String]])
  val m2 = manifest[List[String]]
  (m1 == m2) // false

However you can always compare the erasures.

  (m1.erasure == m2.erasure) // true

Review by dpp.
........
r21044 | extempore | 2010-03-02 21:28:45 +0100 (Tue, 02 Mar 2010) | 2 lines

Removed the symlinks between scalacheck jars to satisfy windows. Tweaked partest to accommodate. No review.
........
r21045 | extempore | 2010-03-02 23:28:45 +0100 (Tue, 02 Mar 2010) | 6 lines

Added --grep command line option to partest. If you want to run every test with the string "Manifest" in the source file, you may now do:

  ./partest --grep Manifest

No review.
........
r21053 | odersky | 2010-03-03 18:33:51 +0100 (Wed, 03 Mar 2010) | 1 line

Closes #3130. No review necessary.
........
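The Manifest equality rules from r21043 can be exercised directly. This is a sketch against the 2.8-era reflection API (both ClassManifest and erasure were deprecated in later Scala releases, so expect deprecation warnings on newer compilers):

```scala
// r21043: full manifests also compare type arguments, so a bare
// ClassManifest is not equal to a full Manifest for the same class,
// but their erasures always compare equal.
import scala.reflect.ClassManifest

object ManifestEqualitySketch {
  def main(args: Array[String]): Unit = {
    val m1 = ClassManifest.fromClass(classOf[List[String]])
    val m2 = manifest[List[String]]

    println(m1 == m2)                 // false: m2 carries type arguments
    println(m1.erasure == m2.erasure) // true: both erase to class List
  }
}
```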
r21054 | odersky | 2010-03-03 18:36:42 +0100 (Wed, 03 Mar 2010) | 1 line

Attempt to fix the typing-a-whileloop problem.
........
r21058 | extempore | 2010-03-04 06:22:57 +0100 (Thu, 04 Mar 2010) | 7 lines

A few yards short of the goal posts attempt at making our usage of Throwable subclasses more consistent. This patch eliminates a lot of ad hoc Exception/Error/etc. creation and various arbitrary choices are rendered slightly less arbitrary. From now on let's try not to use the word "Exception" or "Error" in the names of Throwable subclasses unless they actually derive (and make sense to derive) from Exception or Error. Review by community.
........
r21059 | extempore | 2010-03-04 06:23:21 +0100 (Thu, 04 Mar 2010) | 2 lines

Obeyed source comment to make some classes private, since the problem described seems to be gone. No review.
........
r21060 | phaller | 2010-03-04 12:40:16 +0100 (Thu, 04 Mar 2010) | 1 line

Clean-ups in scheduler hierarchy. Restricted visibility of several traits. Added tests exercising cleaned-up interface.
........
r21062 | rompf | 2010-03-04 14:17:39 +0100 (Thu, 04 Mar 2010) | 1 line

fixed while loop performance. label def rhs re-check only done if first typing produces a different type than the initially assumed one (which is always Unit, so the recheck only happens if some AnnotationChecker says the type is not Unit but Unit @something).
........
r21063 | phaller | 2010-03-04 14:28:12 +0100 (Thu, 04 Mar 2010) | 1 line

Fixed actors.enableForkJoin property. Fixed build. Review by extempore.
........
r21064 | odersky | 2010-03-04 15:06:18 +0100 (Thu, 04 Mar 2010) | 1 line

Closes #3118. review by extempore
........
r21065 | phaller | 2010-03-04 16:05:47 +0100 (Thu, 04 Mar 2010) | 1 line

Removed obsolete SimpleExecutorScheduler, ThreadPoolScheduler, DefaultThreadPoolScheduler, and SchedulerService. Made ThreadPoolConfig private. No review necessary.
........
r21066 | extempore | 2010-03-04 19:05:39 +0100 (Thu, 04 Mar 2010) | 4 lines

Added a comment to Symbols after one too many times forgetting what I was in that file for while I traced which of the linked* functions I wanted. Review by odersky (only because there's also a renaming proposal in there for which I solicit your yea or nay.)
........
r21067 | extempore | 2010-03-04 21:18:41 +0100 (Thu, 04 Mar 2010) | 2 lines

Renamed the linkedFooOfBar methods in Symbol to be internally consistent and in line with modern nomenclature. No review.
........
r21068 | extempore | 2010-03-04 22:06:08 +0100 (Thu, 04 Mar 2010) | 2 lines

Making sure the interpreter always uses the designated output stream rather than unwittingly falling back on predef. No review.
........
r21071 | extempore | 2010-03-05 07:01:19 +0100 (Fri, 05 Mar 2010) | 8 lines

Added -Xmigration option and @migration annotation. At present it will warn about the following changes from 2.7 to 2.8:

  Stack iterator order reversed
  mutable.Set.map returns a Set and thus discards duplicates
  A case 'x @ Pattern' matches differently than 'Pattern'

Review by odersky.
........
r21073 | extempore | 2010-03-05 15:30:44 +0100 (Fri, 05 Mar 2010) | 2 lines

Removed quotes from quoted tokens in command line parser to soothe Windows. Review by community.
........
r21074 | odersky | 2010-03-05 16:31:40 +0100 (Fri, 05 Mar 2010) | 1 line

Closes #3037. Review by extempore.
........
r21075 | odersky | 2010-03-05 16:32:03 +0100 (Fri, 05 Mar 2010) | 1 line

Closes #3026. Review by extempore.
........
r21076 | odersky | 2010-03-05 16:32:24 +0100 (Fri, 05 Mar 2010) | 1 line

Mixing test case. No review.
........
r21077 | odersky | 2010-03-05 16:58:27 +0100 (Fri, 05 Mar 2010) | 1 line

Closes #3015. Review by moors (it's his patch).
........
r21078 | extempore | 2010-03-05 19:17:26 +0100 (Fri, 05 Mar 2010) | 1 line

Cleaning up some redundancy martin noticed. No review.
........
r21079 | extempore | 2010-03-05 20:19:17 +0100 (Fri, 05 Mar 2010) | 2 lines

ScalaRunTime method to perform sameElements as fix for #2867. Review by odersky.
........
r21080 | extempore | 2010-03-05 21:14:13 +0100 (Fri, 05 Mar 2010) | 1 line

Test case for case class equality. Closes #1332. No review.
........
r21081 | extempore | 2010-03-05 22:06:58 +0100 (Fri, 05 Mar 2010) | 5 lines

Fix for #3136 by reverting the line in r18184 which caused this and other regressions. The downside is that the #1697 test case no longer passes, but protracted shrug because it wasn't entirely fixed anyway. Review by moors. (Can you triangulate your way to a patch where both work simultaneously? It's today's bonus challenge!)
........
r21083 | dcaoyuan | 2010-03-06 05:54:15 +0100 (Sat, 06 Mar 2010) | 1 line

escape source file path with space chars
........
r21084 | extempore | 2010-03-06 14:26:30 +0100 (Sat, 06 Mar 2010) | 3 lines

Fixes for #3126. Case class unapply methods now guard against null, and thrown MatchErrors don't NPE trying to stringify null. No review.
........
r21085 | extempore | 2010-03-06 14:45:23 +0100 (Sat, 06 Mar 2010) | 2 lines

Modification to the last patch to return None/false rather than throwing the MatchError. No review.
........
r21086 | extempore | 2010-03-07 06:24:00 +0100 (Sun, 07 Mar 2010) | 5 lines

One minute too many trying to figure out where some partest classpath mutation was disappearing on me, and I snapped and started the process of creating an immutable Settings. This commit is for the most part infrastructure to enable its smooth and uneventful entrance. Review by community.
........
r21087 | extempore | 2010-03-07 07:02:38 +0100 (Sun, 07 Mar 2010) | 2 lines

Still working my way through all the classpath manipulations in partest. No review.
........
r21088 | extempore | 2010-03-07 19:53:07 +0100 (Sun, 07 Mar 2010) | 2 lines

Removed unnecessary DebugSetting, folding the small extra functionality back into ChoiceSetting. No review.
........
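The behavior targeted by r21084/r21085 can be illustrated with a minimal sketch: matching null against a case class pattern should simply fall through to the next case rather than throw a NullPointerException (the Point class below is a hypothetical example, not from the patch):

```scala
// r21084/r21085: case class unapply guards against null, so a null
// scrutinee produces "no match" instead of an NPE or MatchError.
case class Point(x: Int, y: Int)

object NullPatternSketch {
  def describe(a: Any): String = a match {
    case Point(x, y) => "Point(" + x + ", " + y + ")"
    case _           => "no match"
  }

  def main(args: Array[String]): Unit = {
    println(describe(Point(1, 2))) // Point(1, 2)
    println(describe(null))        // no match
  }
}
```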
r21091 | rompf | 2010-03-07 21:52:45 +0100 (Sun, 07 Mar 2010) | 4 lines

- new immutable HashMap implementation based on a hash trie. this is the first iteration, more optimizations will be added later.
- updated test cases to reflect new ordering of elements
- made Map.empty and Set.empty singletons, deprecating classes Map.EmptyMap and Set.EmptySet

Review by extempore, odersky.
........
r21092 | extempore | 2010-03-08 07:06:36 +0100 (Mon, 08 Mar 2010) | 2 lines

More progress toward immutable Settings, and various cleanups encountered along the way. No review.
........
r21093 | extempore | 2010-03-08 07:24:23 +0100 (Mon, 08 Mar 2010) | 6 lines

Created directory for code which is most likely dead but we want to keep around a while in case someone else is using it. It's called src/attic and now it holds four files. Motivations: such files cloud my attempts to figure out what code in the compiler is really being used, they require effort to maintain across changes, and they slow down every build a fraction. Review by community.
........
r21094 | milessabin | 2010-03-08 11:20:05 +0100 (Mon, 08 Mar 2010) | 1 line

Unbreak the IDE build following [21086]. No review.
........
r21098 | dragos | 2010-03-08 14:14:59 +0100 (Mon, 08 Mar 2010) | 1 line

Recursively transform 'new' arguments in specialized programs. Closes #3149, no review.
........
r21099 | odersky | 2010-03-08 14:34:57 +0100 (Mon, 08 Mar 2010) | 1 line

Closes #3026. Review by extempore.
........
r21100 | odersky | 2010-03-08 14:40:32 +0100 (Mon, 08 Mar 2010) | 1 line

Closes #3115. Review by rytz
........
r21101 | odersky | 2010-03-08 14:43:12 +0100 (Mon, 08 Mar 2010) | 1 line

Closes #3006. Review by milessabin because this could be a good basis for showing implicits in the IDE.
........
r21102 | odersky | 2010-03-08 14:45:36 +0100 (Mon, 08 Mar 2010) | 1 line

Avoids two unchecked warnings. Review by extempore.
........
r21103 | odersky | 2010-03-08 14:46:13 +0100 (Mon, 08 Mar 2010) | 1 line

new tests
........
r21105 | moors | 2010-03-08 15:27:48 +0100 (Mon, 08 Mar 2010) | 4 lines

closes #2994
make normalize slightly more aggressive in loading symbol info, while tolerating the righteous cycle (use sym.info.typeParameters instead of unsafeParams)
this is needed to make sure higher-kinded types have their type parameters (otherwise we'd get a PolyType with NoSymbol for typeParams)
........
r21106 | phaller | 2010-03-08 16:01:53 +0100 (Mon, 08 Mar 2010) | 1 line

Reactor now has type parameter. Added Reactor.getState. Made Reactor.start idempotent. Moved Actor.reactWithin to ReplyReactor. Renamed Replyable to CanReply.
........
r21107 | odersky | 2010-03-08 19:12:11 +0100 (Mon, 08 Mar 2010) | 1 line

Refined fix for #2946. Review by extempore.
........
r21108 | phaller | 2010-03-08 19:17:50 +0100 (Mon, 08 Mar 2010) | 1 line

Closes #3102.
........
r21109 | extempore | 2010-03-08 19:46:35 +0100 (Mon, 08 Mar 2010) | 2 lines

Changed partest ant task not to use reflection, instead using the path with which scala was invoked. No review.
........
r21110 | extempore | 2010-03-08 19:57:01 +0100 (Mon, 08 Mar 2010) | 2 lines

Fixed failing test t3115 via judicious application of -Yfatal-warnings. No review.
........
r21111 | extempore | 2010-03-08 20:58:10 +0100 (Mon, 08 Mar 2010) | 8 lines

Added test.debug target to build.xml. This will run whatever tests you have placed in the test/debug directories - critically for those of us stuck debugging ant, this lets one run a small selection of tests by way of ant instead of the console runner. (Sorry about the empty .gitignore files, but one of git's quirks is that it won't acknowledge the existence of an empty directory.) No review.
........
r21113 | extempore | 2010-03-09 01:46:20 +0100 (Tue, 09 Mar 2010) | 2 lines

Temporarily disabling failing test until I can finish my partest work. No review.
........
r21115 | phaller | 2010-03-09 11:35:13 +0100 (Tue, 09 Mar 2010) | 1 line

Made actor-getstate test more robust. No review.
........
r21121 | phaller | 2010-03-09 15:03:02 +0100 (Tue, 09 Mar 2010) | 1 line

New attempt at fixing the tests. No review.
........
r21123 | extempore | 2010-03-10 16:33:14 +0100 (Wed, 10 Mar 2010) | 1 line

Some windows oriented fixes for build.xml. No review.
........
r21124 | extempore | 2010-03-10 16:57:10 +0100 (Wed, 10 Mar 2010) | 1 line

Removed a couple infinite loops in XML. No review.
........
r21125 | extempore | 2010-03-10 17:21:56 +0100 (Wed, 10 Mar 2010) | 2 lines

Some minor compiler support bits for my upcoming partest patch. No review.
........
r21126 | extempore | 2010-03-10 17:30:51 +0100 (Wed, 10 Mar 2010) | 2 lines

...and a line from partest I didn't notice the absence of which would break the build. No review.
........
r21127 | dubochet | 2010-03-10 17:32:10 +0100 (Wed, 10 Mar 2010) | 1 line

FatalError needs a stack trace. No review (was discussed at Scala meeting).
........
r21128 | moors | 2010-03-10 18:43:20 +0100 (Wed, 10 Mar 2010) | 5 lines

closes #3152: refactored adjustTypeArgs and methTypeArgs so that tparams are correctly split into ones that were inferred successfully, and that thus have a corresponding type argument, and those that weren't determined
I didn't investigate the exact cause of the final error message in the bug report, but Jason Zaugg's observations seem correct and I never liked that uninstantiated buffer in the first place.
review by odersky
........
r21129 | moors | 2010-03-10 18:50:15 +0100 (Wed, 10 Mar 2010) | 1 line

slight (syntactic) cleanup of patch for see #3152 -- sorry, only realised when looking over my patch again
........
r21130 | extempore | 2010-03-10 20:18:43 +0100 (Wed, 10 Mar 2010) | 2 lines

Small syntactic adjustment so that last patch from adriaan will build. (Big thumbs up to the aesthetics though.) No review.
........
r21131 | extempore | 2010-03-11 07:00:37 +0100 (Thu, 11 Mar 2010) | 1 line

Some IO conveniences. No review.
........
r21135 | odersky | 2010-03-11 15:21:21 +0100 (Thu, 11 Mar 2010) | 1 line

Closes #2940. Review by extempore. My original idea to replace existentialAbstraction by existentialType in ClassfileParsers was correct after all. However this change triggered another landmine in Definitions, where ClassType queried unsafeTypeParams. I think that was only needed for the migration to Java generics in 2.7, so it can safely go away now. Because the change in classfile parsers forces less of a type, unsafeTypeParams returned the wrong result, which caused the build to fail. The modifications in Erasure and Implicits were attempts to isolate the problem before. They seem to be unnecessary to make the build go through, but are cleaner than the previous versions they replace.
........
r21136 | odersky | 2010-03-11 15:22:29 +0100 (Thu, 11 Mar 2010) | 1 line

Fixed doc comment. No review.
........
r21137 | odersky | 2010-03-11 17:34:44 +0100 (Thu, 11 Mar 2010) | 1 line

Closes #3157 by overriding +: in List. Review by rompf
........
r21138 | rompf | 2010-03-11 17:44:06 +0100 (Thu, 11 Mar 2010) | 1 line

implemented handling of 32-bit collisions in immutable.HashMap. review by community.
........
r21139 | odersky | 2010-03-11 17:53:54 +0100 (Thu, 11 Mar 2010) | 1 line

Closes #3158. No review necessary.
........
r21140 | rompf | 2010-03-11 17:55:38 +0100 (Thu, 11 Mar 2010) | 2 lines

moved the continuations plugin into trunk. it is now part of the distributions under /plugins/continuations.jar which should make scalac load it by default. actual use however must be enabled by passing -P:continuations:enable as command line arg. supporting library code is in package scala.util.continuations and is compiled into scala-library.jar. review by rytz, cunei, odersky.
........
r21141 | odersky | 2010-03-11 18:11:24 +0100 (Thu, 11 Mar 2010) | 1 line

Partially reverted r21018. Closes #3153. No review.
........
r21142 | rompf | 2010-03-11 21:36:43 +0100 (Thu, 11 Mar 2010) | 1 line

added missing file from last commit. no review.
........
r21148 | Joshua.Suereth | 2010-03-12 14:34:05 +0100 (Fri, 12 Mar 2010) | 1 line

Added continuations to maven deployment. review by rompf
........
r21149 | plocinic | 2010-03-12 16:21:25 +0100 (Fri, 12 Mar 2010) | 1 line

do not set the type of the implementation method to be the type of the original one as this is done properly in cloneSymbol. no review (already done by Martin)
........
r21150 | odersky | 2010-03-12 16:38:33 +0100 (Fri, 12 Mar 2010) | 1 line

Closes #3143. Review by moors.
........
r21151 | odersky | 2010-03-12 19:39:40 +0100 (Fri, 12 Mar 2010) | 1 line

Added an object to mangle byte arrays into Java classfile's version of UTF8.
........
r21156 | odersky | 2010-03-13 18:32:19 +0100 (Sat, 13 Mar 2010) | 2 lines

Disabled failing test
........
r21157 | odersky | 2010-03-13 18:33:33 +0100 (Sat, 13 Mar 2010) | 2 lines

Closes #3120. Review by extempore.
........
r21158 | odersky | 2010-03-13 18:34:13 +0100 (Sat, 13 Mar 2010) | 2 lines

Improved version where bumping and zero-encoding are rolled into one.
........
r21159 | extempore | 2010-03-13 20:24:43 +0100 (Sat, 13 Mar 2010) | 2 lines

More support code for the big partest patch I'm working on to finally finish classpaths for good. No review.
........
r21160 | odersky | 2010-03-13 21:21:13 +0100 (Sat, 13 Mar 2010) | 2 lines

Closes #2918. Review by moors.
........
r21162 | extempore | 2010-03-14 07:57:36 +0100 (Sun, 14 Mar 2010) | 1 line

Test case closes #751. No review.
........
r21163 | extempore | 2010-03-14 07:58:02 +0100 (Sun, 14 Mar 2010) | 1 line

Test case for #2940. No review.
........
r21164 | extempore | 2010-03-14 08:25:15 +0100 (Sun, 14 Mar 2010) | 1 line

Tighten update check in cleanup. Closes #3175. No review.
........
r21165 | rompf | 2010-03-14 18:39:56 +0100 (Sun, 14 Mar 2010) | 1 line

improved immutable HashMap iterator. review by community.
........
r21167 | extempore | 2010-03-15 05:45:47 +0100 (Mon, 15 Mar 2010) | 20 lines

Leveraged -Xmigration to burn off some warts which arose in the new collections. Warnings put in place for behavioral changes, allowing the following.

1) Buffers: create new collections on ++ and -- like all the other collections.
2) Maps: eliminated never-shipped redundant method valuesIterable and supplied these return types:

  def keys: Iterable[A]
  def keysIterator: Iterator[A]
  def values: Iterable[B]
  def valuesIterator: Iterator[B]
  def keySet: Set[A]

I concluded that keys should return Iterable because keySet also exists on Map, and is not solely in the province of Maps even if we wanted to change it: it's defined on Sorted and also appears in some Sets. So it seems sensible to have keySet return a Set and keys return the more general type.

Closes #3089, #3145. Review by odersky.
........
r21168 | extempore | 2010-03-15 06:19:53 +0100 (Mon, 15 Mar 2010) | 2 lines

Reverting a couple replacements from that last patch which don't look so safe on re-examination. No review.
........
r21171 | rytz | 2010-03-15 11:25:34 +0100 (Mon, 15 Mar 2010) | 1 line

Fix for msil compiler. Unlike java.lang.Class, System.Type does not take a type parameter. Related to r21135. review by odersky.
........
r21174 | prokopec | 2010-03-15 11:44:27 +0100 (Mon, 15 Mar 2010) | 1 line

Fixes #3155. No review is necessary.
........
r21175 | prokopec | 2010-03-15 12:03:03 +0100 (Mon, 15 Mar 2010) | 1 line

Fixes #3132. No review necessary.
........
r21176 | prokopec | 2010-03-15 13:44:32 +0100 (Mon, 15 Mar 2010) | 1 line

Fixes #3086. Review by community.
........
r21177 | rompf | 2010-03-15 14:44:53 +0100 (Mon, 15 Mar 2010) | 1 line

new immutable.HashSet. review by community.
........
r21178 | rytz | 2010-03-15 14:54:23 +0100 (Mon, 15 Mar 2010) | 1 line

minor cleanup to build.xml. review by rompf
........
r21179 | prokopec | 2010-03-15 15:45:33 +0100 (Mon, 15 Mar 2010) | 1 line

Fixes #3091. Review by community.
........
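The Map return types settled on in r21167 are still visible in the library today: keys and values give back the general Iterable, while keySet is a real Set. A small sketch:

```scala
// r21167: keys/values return Iterable; keySet returns a Set.
object MapViewsSketch {
  def main(args: Array[String]): Unit = {
    val m = Map(1 -> "one", 2 -> "two")

    val keys:   Iterable[Int]    = m.keys
    val values: Iterable[String] = m.values
    val keySet: Set[Int]         = m.keySet

    println(keySet)                // Set(1, 2)
    println(keys.toSet == keySet)  // true
  }
}
```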
r21180 | rompf | 2010-03-15 16:48:28 +0100 (Mon, 15 Mar 2010) | 1 line

fixed treatment of annotated types in isNumericSubType. re-enabled test case. review by odersky
........
r21181 | extempore | 2010-03-15 18:08:16 +0100 (Mon, 15 Mar 2010) | 3 lines

Tracked down docs.lib build issue from the dentist's chair while waiting for my teeth to numb. Checking in over open wireless access point. This is dedication. No review.
........
r21184 | rompf | 2010-03-16 09:19:59 +0100 (Tue, 16 Mar 2010) | 1 line

added support for continuations in try/catch blocks. review by community.
........
r21186 | prokopec | 2010-03-16 10:59:37 +0100 (Tue, 16 Mar 2010) | 1 line

Changed `!=` to `ne` for #3086. No review.
........
r21187 | prokopec | 2010-03-16 14:10:45 +0100 (Tue, 16 Mar 2010) | 1 line

Fixes #3091. Review by community.
........
r21188 | prokopec | 2010-03-16 15:23:13 +0100 (Tue, 16 Mar 2010) | 1 line

Fixes infinite streams in #3091. No review.
........
r21189 | odersky | 2010-03-16 15:40:43 +0100 (Tue, 16 Mar 2010) | 1 line

Closes #3180. No review.
........
r21190 | odersky | 2010-03-16 15:42:09 +0100 (Tue, 16 Mar 2010) | 1 line

Fixes nitpicks by Adriaan in his review. No review necessary.
........
r21193 | odersky | 2010-03-16 16:26:33 +0100 (Tue, 16 Mar 2010) | 2 lines

Closes #2913. Review by rytz. (The error was that too few/too many argument errors had a position different from the other errors, so no second try was done for them.)
........
r21195 | odersky | 2010-03-16 17:12:46 +0100 (Tue, 16 Mar 2010) | 1 line

new tests
........
r21196 | odersky | 2010-03-16 17:22:44 +0100 (Tue, 16 Mar 2010) | 1 line

Closes #2688 by disallowing call-by-name implicit parameters. No review.
........
r21199 | odersky | 2010-03-16 22:07:16 +0100 (Tue, 16 Mar 2010) | 2 lines

Fixed build problem by eliminating a redundant implicit in scalap. Review by extempore.
........
r21200 | rompf | 2010-03-16 22:53:07 +0100 (Tue, 16 Mar 2010) | 1 line

added test case for #2417. no review
........
r21201 | rompf | 2010-03-17 00:06:10 +0100 (Wed, 17 Mar 2010) | 1 line

closes #3112. no review.
........
r21205 | phaller | 2010-03-17 12:07:50 +0100 (Wed, 17 Mar 2010) | 1 line

Closes #3185. Review by plocinic.
........
r21206 | milessabin | 2010-03-17 14:16:09 +0100 (Wed, 17 Mar 2010) | 1 line

Continuations support classes are included in scala-library.jar so their sources should be in scala-library-src.jar. Also export scala.util.continuations from the scala-library bundle.
........
r21207 | plocinic | 2010-03-17 17:02:42 +0100 (Wed, 17 Mar 2010) | 1 line

Closes #3133. Review by community.
........
r21210 | prokopec | 2010-03-18 11:23:26 +0100 (Thu, 18 Mar 2010) | 1 line

Reverse didn't work for empty ranges. Review by extempore.
........
r21214 | plocinic | 2010-03-19 10:09:00 +0100 (Fri, 19 Mar 2010) | 1 line

Fixes #3054. No review.
........
r21215 | dubochet | 2010-03-19 14:29:42 +0100 (Fri, 19 Mar 2010) | 1 line

Updated SVN ignore patterns. No review.
........
r21216 | phaller | 2010-03-19 18:03:14 +0100 (Fri, 19 Mar 2010) | 1 line

Closes #2827. Review by community.
........
r21217 | odersky | 2010-03-19 18:35:58 +0100 (Fri, 19 Mar 2010) | 2 lines

Spring cleaning of collection libraries. Closes #3117 by forcing a view when nothing else can be done. If people think some operations can be more lazy, please provide patches/do changes. Also brought proxies and forwarders into line.
........
r21218 | odersky | 2010-03-19 18:36:46 +0100 (Fri, 19 Mar 2010) | 1 line

new version of decode that does not need a length. Moved test code to tests.
........
r21219 | extempore | 2010-03-19 20:38:24 +0100 (Fri, 19 Mar 2010) | 15 lines

More fun with -Xmigration. Followed through on the changes to BufferLike (++ and similar now create a new collection.) Removed MapLikeBase. Annotated all the methods in mutable.{ Map, Set } which mutated in-place in 2.7 to note that they create new collections, and implemented same.
At this point the only +/- like method which mutates in place which I am aware of is BufferLike.+ (see source comment for my observations.) Also tweaked some collections return types as necessitated by these changes, such as mutable.Set.clone() now returning "This" rather than mutable.Set[A].

References #3089, closes #3179. Review by odersky.
........
r21220 | milessabin | 2010-03-19 22:34:32 +0100 (Fri, 19 Mar 2010) | 1 line

Added a tryToSetFromPropertyValue implementation for MultiStringSetting.
........
r21222 | extempore | 2010-03-19 22:48:42 +0100 (Fri, 19 Mar 2010) | 5 lines

Returning to the thrilling world of equality and hashCodes now that Any.## is a reality. Moved the hash functions from Predef to ScalaRunTime, and made what appears to be an optimization to equals by not losing the result of an instanceof test. Review by community.
........
r21223 | extempore | 2010-03-19 23:37:13 +0100 (Fri, 19 Mar 2010) | 3 lines

Half-disabled productElementName until I have time to reimplement it more to martin's liking. ("Half" because full disabling is not possible until starr has forgotten about it.) No review.
........
r21224 | extempore | 2010-03-20 05:24:39 +0100 (Sat, 20 Mar 2010) | 3 lines

Some work on the Array methods as they manifest in refinement types: tightening when Array code is generated and also what code is generated. Review by dubochet.
........
r21225 | extempore | 2010-03-21 01:21:54 +0100 (Sun, 21 Mar 2010) | 6 lines

Some minor changes in scala.swing.* which I was glancing through because of #3196. I noticed the Font object was in package scala instead of scala.swing, which looks sure to be a mistake (an easy one to make, and one others have made as well, because we're not entirely used to package objects.) I didn't want to accidentally ship a scala.Font so I moved it into swing. Review by imaier.
........
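The BufferLike convention settled in r21219 above — ++ builds a new collection while ++= stays the in-place mutator — is easy to demonstrate:

```scala
import scala.collection.mutable.ArrayBuffer

// r21219: on Buffers, ++ creates a new collection like everywhere
// else in the library; ++= remains the in-place mutating variant.
object BufferPlusPlusSketch {
  def main(args: Array[String]): Unit = {
    val buf = ArrayBuffer(1, 2)
    val extended = buf ++ Seq(3)

    println(buf)      // ArrayBuffer(1, 2) -- original untouched
    println(extended) // ArrayBuffer(1, 2, 3)

    buf ++= Seq(3)    // mutates in place
    println(buf)      // ArrayBuffer(1, 2, 3)
  }
}
```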
r21226 | extempore | 2010-03-21 01:22:22 +0100 (Sun, 21 Mar 2010) | 5 lines

During my last look at r21224 I noticed what must be a long standing bug in Array.update handling. Fixing this probably never to be noticed corner case (see bug3175.scala) seduced me into drumming out some duplication. At least we got some nice commenting out of it. Review by dubochet.
........
r21227 | extempore | 2010-03-22 00:00:14 +0100 (Mon, 22 Mar 2010) | 1 line

Some support code related to partest changes. No review.
........
r21228 | rompf | 2010-03-22 17:52:10 +0100 (Mon, 22 Mar 2010) | 1 line

closes #3199. review by community.
........
r21229 | extempore | 2010-03-22 23:09:58 +0100 (Mon, 22 Mar 2010) | 4 lines

Consistency work on Addable and Growable. Deprecated '+' on all Seq-derived classes. Creating GrowingBuilder to complement AddingBuilder on classes with += but not +. Fixed some inconsistencies I came across in the process. No review.
........
r21230 | extempore | 2010-03-23 00:02:50 +0100 (Tue, 23 Mar 2010) | 5 lines

Noticed we still have a bunch of collection classes which are rather lacking. Did some integration, added some companion objects. Not thrilled with the overall picture in there, there's still a lot which should be done. Updated a deprecation message, closes #3202. No review.
........
r21231 | extempore | 2010-03-23 05:17:59 +0100 (Tue, 23 Mar 2010) | 17 lines

Went ahead and implemented classpaths as described in email to scala-internals on the theory that at this point I must know what I'm doing.

** PUBLIC SERVICE ANNOUNCEMENT **

If your code of whatever kind stopped working with this commit (most likely the error is something like "object scala not found") you can get it working again with either of:

  passing -usejavacp on the command line
  set system property "scala.usejavacp" to "true"

Either of these will alert scala that you want the java application classpath to be utilized by scala as well. Review by community.
........
r21232 | extempore | 2010-03-23 07:08:55 +0100 (Tue, 23 Mar 2010) | 4 lines

Fix for #3204. This is a really good example of the issues that can arise when return types of public facing methods are inferred. We eventually need some mechanism to make such issues easier to avoid. No review.
........
r21233 | extempore | 2010-03-23 07:26:08 +0100 (Tue, 23 Mar 2010) | 3 lines

You try to get away with one little line of uncompiled patch... reverting last patch since I'm too tired to see why it broke the build. No review.
........
r21234 | dubochet | 2010-03-23 15:38:11 +0100 (Tue, 23 Mar 2010) | 7 lines

Scala signature is generated as an annotation (that is accessible through Java reflection).
- compiler generates all pickled Scala signatures as annotations to class files.
- compiler can read class files with signature as annotations or old-style signatures as attributes.
- Scalap has also been updated to new signatures (contributed by Ilya Sergey: thanks a lot).
- FJBG updated to allow entering constant pool strings as byte arrays.
- ByteCodecs decode method returns the length of the decoded array.
Review by ilyas. Already mostly reviewed by odersky.
........
r21235 | phaller | 2010-03-23 16:11:05 +0100 (Tue, 23 Mar 2010) | 1 line

Fixes #3186. Closes #2214.
........
r21236 | phaller | 2010-03-23 16:23:39 +0100 (Tue, 23 Mar 2010) | 1 line

Added test case for #3186. Closes #3186.
........
r21237 | dragos | 2010-03-23 17:29:38 +0100 (Tue, 23 Mar 2010) | 1 line

Closed #3195. Review by extempore.
........
r21238 | extempore | 2010-03-23 17:52:51 +0100 (Tue, 23 Mar 2010) | 2 lines

Although it was working fine, a test case for @elidable to make sure that state of affairs continues. No review.
........
r21239 | extempore | 2010-03-23 18:26:23 +0100 (Tue, 23 Mar 2010) | 2 lines

Added some documentation to the methods in Predef which utilize @elidable. No review.
........
r21240 | extempore | 2010-03-23 19:51:08 +0100 (Tue, 23 Mar 2010) | 1 line

Fix and test case for #3169.
........
r21241 | extempore | 2010-03-23 20:13:43 +0100 (Tue, 23 Mar 2010) | 13 lines

You know Cutty McPastington is having a good time when you can find this logic in two different files:

  ('A' <= c && c <= 'Z') || ('a' <= c && c <= 'a') ||

How could that possibly work, you might ask. After a series of ||s, the last condition subsumes most of the previous ones:

  Character.isUnicodeIdentifierStart(c)

thereby saving us from a world where the only legal lower case identifiers are a, aa, aaa, aaaa, and a few others. No review.
........
r21242 | extempore | 2010-03-23 21:16:51 +0100 (Tue, 23 Mar 2010) | 2 lines

Remedied accidental obscuring of -X, -Y, and -P in the standard help output. No review.
........
r21243 | dubochet | 2010-03-23 21:23:49 +0100 (Tue, 23 Mar 2010) | 1 line

Fixed build. Partial revert of r21234. All the infrastructure to read new-style signatures is still in, but the compiler again generates old-style signatures.
........
r21244 | extempore | 2010-03-23 22:58:49 +0100 (Tue, 23 Mar 2010) | 2 lines

Removed ArgumentsExpander in favor of having all arguments parsed the same way. No review.
........
r21245 | phaller | 2010-03-24 10:58:24 +0100 (Wed, 24 Mar 2010) | 1 line

Fixed the serialization test. Restored the test to use the semantics of Enumeration#equals in 2.7. Made caching of Enumeration objects thread safe. See #3186. Review by extempore.
........
r21246 | odersky | 2010-03-24 15:23:40 +0100 (Wed, 24 Mar 2010) | 1 line

Fixes problematic equality of Enumeration values.
........
r21247 | odersky | 2010-03-24 15:29:04 +0100 (Wed, 24 Mar 2010) | 1 line

Fixes problematic equality of Enumeration values.
........
r21248 | odersky | 2010-03-24 15:43:45 +0100 (Wed, 24 Mar 2010) | 1 line

Closes #3187. No review.
........
r21249 | extempore | 2010-03-24 16:37:50 +0100 (Wed, 24 Mar 2010) | 3 lines

Reverted a presumably unintentional reincarnation of old predef (these functions are in ScalaRunTime now.)
Review by odersky just in case there was a secret plan.
........
r21250 | dubochet | 2010-03-24 16:54:21 +0100 (Wed, 24 Mar 2010) | 1 line

Scala signature is generated as an annotation, second try. Review by dragos.
........
r21252 | odersky | 2010-03-24 17:20:32 +0100 (Wed, 24 Mar 2010) | 1 line

new readme. no review.
........
r21253 | extempore | 2010-03-24 17:23:48 +0100 (Wed, 24 Mar 2010) | 2 lines

Fixed an issue with no-parameter-list methods not being elided. No review.
........
r21254 | dubochet | 2010-03-24 17:59:22 +0100 (Wed, 24 Mar 2010) | 5 lines

[scaladoc] Improved Scaladoc comment syntax, contributed by Pedro Furlanetto.
- Wiki syntax supports nested, numbered and unnumbered lists;
- Wiki syntax supports links (entity links currently require fully qualified names);
- Stars no longer are mandatory to start comment lines.
Already reviewed by dubochet; no review.
........
r21256 | extempore | 2010-03-24 18:18:03 +0100 (Wed, 24 Mar 2010) | 6 lines

Renamed partialMap to collect. There was a deprecated no-argument method on Iterator called collect which I had to remove, because if the method is overloaded it puts a bullet in the type inference, an intolerable result for a function which takes a partial function as its argument. I don't think there's much chance of confusion, but I put a migration warning on collect just in case. No review.
........
r21257 | phaller | 2010-03-24 18:29:59 +0100 (Wed, 24 Mar 2010) | 1 line

Addresses see #2017. Documents class scala.actors.Exit. Review by community.
........
r21258 | phaller | 2010-03-24 18:31:37 +0100 (Wed, 24 Mar 2010) | 1 line

Adds tests for see #2017.
........
r21259 | extempore | 2010-03-24 18:34:54 +0100 (Wed, 24 Mar 2010) | 1 line

Fixed a test case I broke with the collect rename. No review.
........
r21260 | rompf | 2010-03-24 18:55:15 +0100 (Wed, 24 Mar 2010) | 1 line

continuations plugin will now report a nice error message if it is not enabled and encounters an @cps expression.
review by rytz ........ r21261 | extempore | 2010-03-24 20:47:41 +0100 (Wed, 24 Mar 2010) | 1 line Apparently I can't fix a test case to save my life. No review. ........ r21262 | rompf | 2010-03-24 23:43:44 +0100 (Wed, 24 Mar 2010) | 1 line reverting changes from r21260. there is a deeper problem that causes the plugin to be loaded twice but only one instance receives the enable flag (hence, the other one complains). no review ........ r21263 | phaller | 2010-03-25 09:52:08 +0100 (Thu, 25 Mar 2010) | 1 line Makes two actor tests deterministic. No review. ........ r21264 | phaller | 2010-03-25 13:46:01 +0100 (Thu, 25 Mar 2010) | 1 line Removed obsolete version numbers. No review. ........ r21265 | phaller | 2010-03-25 14:14:28 +0100 (Thu, 25 Mar 2010) | 1 line Renamed Replyable* types to *CanReply. No review. ........ r21266 | phaller | 2010-03-25 14:18:23 +0100 (Thu, 25 Mar 2010) | 1 line Renamed Replyable* source files to the types they define. No review. ........ r21269 | dragos | 2010-03-25 15:22:30 +0100 (Thu, 25 Mar 2010) | 1 line Fixed order of fields in the generated code. No review. ........ r21271 | rompf | 2010-03-25 17:14:56 +0100 (Thu, 25 Mar 2010) | 1 line fixed double-loading of plugins. reinstated not-enabled error msg for cps plugin. review by community. ........ r21273 | extempore | 2010-03-25 17:59:14 +0100 (Thu, 25 Mar 2010) | 2 lines New scalacheck jar because recent Actor changes broke binary compatibility. No review. ........ r21274 | extempore | 2010-03-25 20:55:53 +0100 (Thu, 25 Mar 2010) | 15 lines While working on partest discovered that CompilerCommand ignores half its constructor arguments and a couple dozen places blithely pass it those arguments as if they're being used. 
Then there were setups like this: class OfflineCompilerCommand( arguments: List[String], settings: Settings, error: String => Unit, interactive: Boolean) extends CompilerCommand(arguments, new Settings(error), error, false) Hey offline compiler command, why throw away the perfectly good settings you were given? Ever heard 'reduce, reuse, recycle'? How did you ever work... or do you? No review. ........ r21275 | odersky | 2010-03-25 21:21:45 +0100 (Thu, 25 Mar 2010) | 2 lines I think this closes #2433. Only verified by synthetic test case t2433 which crashed before and compiles now. Review by extempore. ........ r21276 | extempore | 2010-03-25 22:53:58 +0100 (Thu, 25 Mar 2010) | 7 lines Altered classpath behavior when no default is given. Now in that case the contents of environment variable CLASSPATH will be used as the scala user classpath, and only if that is not present will "." be used. Be advised that there are still various "hand assembled" sorts of classpaths in trunk, and there's not yet any way to ensure they honor this; things which use the normal Settings object should do the right thing. No review. ........ r21278 | extempore | 2010-03-26 05:26:03 +0100 (Fri, 26 Mar 2010) | 1 line Some minor I/O changes. No review. ........ r21279 | extempore | 2010-03-26 05:59:58 +0100 (Fri, 26 Mar 2010) | 2 lines Tweaked help output a little further so -Y isn't visible except to those who consider themselves advanced. No review. ........ r21280 | extempore | 2010-03-26 13:20:18 +0100 (Fri, 26 Mar 2010) | 1 line Fix for #3204. No review. ........ r21281 | dubochet | 2010-03-26 15:34:34 +0100 (Fri, 26 Mar 2010) | 1 line Unparsed Scala signature annotations are not added to the symbol table. Review by dragos. ........ r21282 | ilyas | 2010-03-26 16:53:05 +0100 (Fri, 26 Mar 2010) | 1 line missing quotes for annotation values added ........ r21283 | ilyas | 2010-03-26 17:07:39 +0100 (Fri, 26 Mar 2010) | 1 line some output polishing ........ 
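The classpath fallback described in r21276 above (an explicit setting wins, else the CLASSPATH environment variable, else ".") can be sketched as follows. `userClasspath` is an illustrative helper, not the name actually used by the Settings object:

```scala
// Sketch of r21276's resolution order for the scala user classpath:
// explicit setting first, then a non-empty CLASSPATH environment
// variable, then the current directory. Hypothetical helper name.
def userClasspath(explicit: Option[String]): String =
  explicit
    .orElse(sys.env.get("CLASSPATH").filter(_.nonEmpty))
    .getOrElse(".")
```

As the commit notes, only code paths that go through the normal Settings object are guaranteed to honor this order.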
r21284 | rompf | 2010-03-26 18:35:31 +0100 (Fri, 26 Mar 2010) | 1 line improvements to cps exception handling. among other things, finally clauses are now illegal for cps try/catch blocks. transforming them correctly is prohibitively tricky. review by community. ........ r21285 | extempore | 2010-03-27 06:41:47 +0100 (Sat, 27 Mar 2010) | 1 line TraversableOnce. Review by odersky. ........ r21286 | imaier | 2010-03-28 13:34:48 +0200 (Sun, 28 Mar 2010) | 1 line Fixed #3090 ........ r21287 | imaier | 2010-03-28 14:28:39 +0200 (Sun, 28 Mar 2010) | 1 line Fixed #2803. Added warning for UIElement.cachedWrapper. ........ r21288 | imaier | 2010-03-28 15:01:28 +0200 (Sun, 28 Mar 2010) | 1 line Fix for #2980. No review. ........ r21289 | imaier | 2010-03-28 15:15:31 +0200 (Sun, 28 Mar 2010) | 1 line Fixed #2753. No review. ........ r21290 | imaier | 2010-03-28 15:27:42 +0200 (Sun, 28 Mar 2010) | 1 line Fixed #3219. No review. ........ r21291 | imaier | 2010-03-28 15:56:18 +0200 (Sun, 28 Mar 2010) | 1 line Fixed #2242. No review. ........ r21292 | rompf | 2010-03-29 11:55:44 +0200 (Mon, 29 Mar 2010) | 1 line closes 2864. closes 2934. closes 3223. closes 3225. review by community. ........ r21294 | dubochet | 2010-03-29 14:40:39 +0200 (Mon, 29 Mar 2010) | 1 line Fix to the way Scalap decodes ScalaSignature annotations. Contributed by ilyas. Already reviewed by dubochet, no review. ........ r21295 | odersky | 2010-03-29 14:53:07 +0200 (Mon, 29 Mar 2010) | 1 line Closes #2386 by requiring class manifests for an array element type if a class manifest for the array type is demanded. Review by dubochet. ........ r21296 | dubochet | 2010-03-29 15:14:10 +0200 (Mon, 29 Mar 2010) | 1 line Reverted file that was unintentionally committed as part of r21294. ........ r21297 | milessabin | 2010-03-29 15:38:11 +0200 (Mon, 29 Mar 2010) | 1 line Patch from Mirko Stocker to add position information to val/var modifiers on ctor params for use by tools. Review by odersky.
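The TraversableOnce introduced in r21285 above is the common supertype of collections and iterators, anything that can be traversed at least once, so a single method can accept both. A minimal illustration (in 2.13 and later the type survives as a deprecated alias of IterableOnce):

```scala
// TraversableOnce abstracts over Traversable and Iterator: anything
// that can be traversed at least once can be folded.
def total(xs: TraversableOnce[Int]): Int = xs.foldLeft(0)(_ + _)

val fromList = total(List(1, 2, 3))     // a strict collection
val fromIter = total(Iterator(4, 5, 6)) // a single-pass iterator
```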
........ r21299 | rompf | 2010-03-29 21:22:50 +0200 (Mon, 29 Mar 2010) | 1 line fixes the unfounded "name clash between inherited members" error. review by dragos. ........ r21303 | dubochet | 2010-03-30 20:37:25 +0200 (Tue, 30 Mar 2010) | 1 line [scaladoc] Fixed the nightly build. Wiki parser correctly handles lists with unknown bullet kind. No review. ........ r21304 | extempore | 2010-03-30 23:25:16 +0200 (Tue, 30 Mar 2010) | 3 lines Noticed that the implementation of toArray Iterator had acquired via TraversableOnce called "size" to allocate the array, leaving a nice empty iterator to actually populate it. Fixed. No review. ........ r21305 | rompf | 2010-03-31 14:20:41 +0200 (Wed, 31 Mar 2010) | 1 line closes #3203, overriding more of the TraversableLike methods. also tightened access privileges to internal fields and methods. review by community. ........ r21307 | rytz | 2010-03-31 16:00:09 +0200 (Wed, 31 Mar 2010) | 1 line close #3222. review by community ........ r21309 | rytz | 2010-03-31 18:56:40 +0200 (Wed, 31 Mar 2010) | 1 line close #3183. review by community ........ r21313 | rytz | 2010-04-01 10:39:11 +0200 (Thu, 01 Apr 2010) | 1 line close #3178. review by community ........ r21322 | rompf | 2010-04-02 15:11:23 +0200 (Fri, 02 Apr 2010) | 1 line closes #3242. review by community. ........ r21323 | extempore | 2010-04-02 23:09:34 +0200 (Fri, 02 Apr 2010) | 5 lines Mostly IO tweaks related to my upcoming partest patch, which to my chagrin is being held up by windows. Also updates the default ANT_OPTS to be the same as the ones the nightlies override it with. (If we know you can't build scala with those settings it seems kind of uncool to leave them for everyone else.) No review. ........ r21324 | rompf | 2010-04-03 18:07:58 +0200 (Sat, 03 Apr 2010) | 1 line improved cps transform of partial functions. no review. ........ 
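The bug r21304 above removes is a classic single-pass trap: calling `size` on an `Iterator` consumes it, so a toArray written as "measure, allocate, then copy" copies from an already exhausted iterator. The trap in miniature:

```scala
// Iterator is single-pass: size exhausts it, so nothing is left
// for a subsequent copy.
val it = Iterator(1, 2, 3)
val n = it.size            // 3, but the elements are now consumed
val leftover = it.toList   // Nil: nothing remains to populate an array
```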
r21325 | dubochet | 2010-04-03 22:13:19 +0200 (Sat, 03 Apr 2010) | 1 line [scaladoc] Considerably reduced size of documentation by not generating certain strange inner classes. Scaladoc is much much faster (more than 10x on library); not exactly clear why. Protected members are printed in documentation and displayed on demand. Review by malayeri. ........ r21326 | extempore | 2010-04-04 04:58:11 +0200 (Sun, 04 Apr 2010) | 2 lines Nipped the infinite loop which is presently launched by an attempt to run test.continuations.suite with -optimise. No review. ........ r21327 | rompf | 2010-04-04 15:14:44 +0200 (Sun, 04 Apr 2010) | 1 line workaround for #3252. review by extempore. ........ r21328 | extempore | 2010-04-04 18:59:25 +0200 (Sun, 04 Apr 2010) | 1 line Removing a class cast exception. Closes #2843, no review. ........ r21329 | extempore | 2010-04-05 08:24:22 +0200 (Mon, 05 Apr 2010) | 13 lines If I work on this patch any longer without checking in I will go stark raving mad. It is broken up into a couple pieces. This one is the changes to test/. It includes fixing a bunch of tests, removing deprecated constructs, moving jars used by tests to the most specific plausible location rather than having all jars on the classpath of all tests, and some filesystem layout change (continuations get their whole own srcpath.) This would be the world's most tedious review, so let's say no review. [Note: after this commit, I doubt things will build very smoothly until the rest of the partest changes follow. Which should only be seconds, but just in case.] ........ r21330 | extempore | 2010-04-05 08:25:16 +0200 (Mon, 05 Apr 2010) | 7 lines The code part of the partest patch. If anyone wants to review it they can be my guest (reviewbot: review by community!) More realistically: more than likely I have unwittingly altered or impaired some piece of functionality used by someone somewhere. Please alert me if this is the case and I will remedy it. 
I have to call it at this point as the best interests of 2.8 cannot be served by me nursing this patch along any further. ........ r21331 | odersky | 2010-04-05 15:47:27 +0200 (Mon, 05 Apr 2010) | 2 lines Rearranging IndexedSeq/LinearSeq and related work ........ r21332 | odersky | 2010-04-05 18:53:53 +0200 (Mon, 05 Apr 2010) | 2 lines Made Vector the standard impl of IndexedSeq. Review by rompf. ........ r21333 | extempore | 2010-04-05 19:40:59 +0200 (Mon, 05 Apr 2010) | 1 line Typo patrol, no review. ........ r21341 | extempore | 2010-04-06 02:40:25 +0200 (Tue, 06 Apr 2010) | 1 line A removal that didn't take. No review. ........ r21342 | extempore | 2010-04-06 02:42:50 +0200 (Tue, 06 Apr 2010) | 4 lines Fix for the partest task to fail the build when a test fails, and fixes for 2/3 of the quietly failing tests. I'm not quite sure what to do about the view ones, it doesn't look like a simple rename is going to cut it, so: review by odersky. ........ r21343 | extempore | 2010-04-06 03:26:31 +0200 (Tue, 06 Apr 2010) | 5 lines As a brief diversion from real work, implemented Damerau–Levenshtein and ran it on trunk to elicit obvious misspellings. Unfortunately they're mostly in places like compiler comments which real people never see, but I fixed them anyway. All those English Lit majors who peruse our sources are sure to be pleased. No review. ........ r21344 | extempore | 2010-04-06 04:05:20 +0200 (Tue, 06 Apr 2010) | 1 line Noticed a bug with test obj dirs not getting deleted. No review. ........ r21345 | extempore | 2010-04-06 07:18:46 +0200 (Tue, 06 Apr 2010) | 2 lines A couple more bits of partest I discovered weren't doing their jobs. Some of my classiest messages were going unheard! No review. ........ r21346 | extempore | 2010-04-06 07:19:19 +0200 (Tue, 06 Apr 2010) | 2 lines Some tweaks to classpath handling I had left over from trying to figure out the continuations plugin issue. No review. ........ 
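For the curious, the restricted (optimal string alignment) variant of Damerau–Levenshtein mentioned in r21343 above fits in a few lines. This is a generic sketch of the algorithm, not the compiler's actual implementation:

```scala
// Optimal string alignment distance: edit distance counting insertions,
// deletions, substitutions, and transpositions of adjacent characters.
def distance(a: String, b: String): Int = {
  val d = Array.ofDim[Int](a.length + 1, b.length + 1)
  for (i <- 0 to a.length) d(i)(0) = i        // delete all of a's prefix
  for (j <- 0 to b.length) d(0)(j) = j        // insert all of b's prefix
  for (i <- 1 to a.length; j <- 1 to b.length) {
    val cost = if (a(i - 1) == b(j - 1)) 0 else 1
    d(i)(j) = math.min(
      math.min(d(i - 1)(j) + 1, d(i)(j - 1) + 1), // deletion, insertion
      d(i - 1)(j - 1) + cost)                      // substitution / match
    // adjacent transposition, e.g. "teh" -> "the"
    if (i > 1 && j > 1 && a(i - 1) == b(j - 2) && a(i - 2) == b(j - 1))
      d(i)(j) = math.min(d(i)(j), d(i - 2)(j - 2) + cost)
  }
  d(a.length)(b.length)
}
```

Ranking dictionary words by this distance against each identifier is enough to surface obvious misspellings like the ones the commit fixed.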
r21347 | imaier | 2010-04-06 13:43:00 +0200 (Tue, 06 Apr 2010) | 1 line Fixed #3257 ........ r21348 | odersky | 2010-04-06 15:53:39 +0200 (Tue, 06 Apr 2010) | 1 line Optimized toArray for ArrayOps and WrappedArrays. Changed printing of Views. Fixed IndexedSeqView problems. Review by extempore. ........ r21349 | prokopec | 2010-04-06 16:39:51 +0200 (Tue, 06 Apr 2010) | 1 line Fixes #2535. Review by community. ........ r21350 | prokopec | 2010-04-06 16:56:14 +0200 (Tue, 06 Apr 2010) | 1 line Forgot to add scalacheck test for #2535. Review by community. ........ r21351 | extempore | 2010-04-06 17:09:02 +0200 (Tue, 06 Apr 2010) | 1 line Final methods should appear in scaladoc. Closes #3067, no review. ........ r21353 | extempore | 2010-04-06 20:27:29 +0200 (Tue, 06 Apr 2010) | 1 line Removing some code duplication from scaladoc. Review by dubochet. ........ r21354 | extempore | 2010-04-07 00:46:22 +0200 (Wed, 07 Apr 2010) | 5 lines Fixed another partest feature I'd managed to break at the very last minute. When a test is too slow finishing, there will be messages identifying the test. It defaults to 90 seconds before the first warning because I know some machines are slow, but it'd be nice if that was more like 30. No review. ........ r21356 | extempore | 2010-04-07 01:57:11 +0200 (Wed, 07 Apr 2010) | 2 lines And another partest gap is filled. Now if you pass --quick to partest it really will use quick as the build dir. No review. ........
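The effect of r21332 earlier in this log (Vector as the standard IndexedSeq implementation) is directly observable from the factory:

```scala
// Since r21332, immutable.IndexedSeq's default implementation is Vector,
// which gives effectively constant-time indexed access and updates.
val xs = scala.collection.immutable.IndexedSeq(1, 2, 3)
val isVector = xs.isInstanceOf[Vector[_]]
val third = xs(2)
```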
Diffstat (limited to 'src/library')
-rw-r--r--src/library/scala/Application.scala3
-rw-r--r--src/library/scala/Array.scala23
-rw-r--r--src/library/scala/Console.scala2
-rw-r--r--src/library/scala/Enumeration.scala94
-rw-r--r--src/library/scala/Function.scala11
-rw-r--r--src/library/scala/Immutable.scala2
-rw-r--r--src/library/scala/LowPriorityImplicits.scala26
-rw-r--r--src/library/scala/NotDefinedError.scala1
-rw-r--r--src/library/scala/Option.scala21
-rw-r--r--src/library/scala/Predef.scala72
-rw-r--r--src/library/scala/Product.scala17
-rw-r--r--src/library/scala/Tuple2.scala5
-rw-r--r--src/library/scala/Tuple3.scala5
-rw-r--r--src/library/scala/annotation/elidable.scala8
-rw-r--r--src/library/scala/annotation/migration.scala28
-rw-r--r--src/library/scala/collection/BitSetLike.scala5
-rw-r--r--src/library/scala/collection/IndexedSeq.scala10
-rw-r--r--src/library/scala/collection/IndexedSeqLike.scala274
-rwxr-xr-xsrc/library/scala/collection/IndexedSeqOptimized.scala293
-rw-r--r--src/library/scala/collection/IndexedSeqView.scala38
-rw-r--r--src/library/scala/collection/IndexedSeqViewLike.scala113
-rw-r--r--src/library/scala/collection/IterableLike.scala8
-rw-r--r--src/library/scala/collection/IterableProxyLike.scala26
-rw-r--r--src/library/scala/collection/IterableViewLike.scala11
-rw-r--r--src/library/scala/collection/Iterator.scala359
-rw-r--r--src/library/scala/collection/JavaConversions.scala15
-rw-r--r--src/library/scala/collection/LinearSeq.scala12
-rw-r--r--src/library/scala/collection/LinearSeqLike.scala282
-rwxr-xr-xsrc/library/scala/collection/LinearSeqOptimized.scala301
-rw-r--r--src/library/scala/collection/MapLike.scala54
-rw-r--r--src/library/scala/collection/MapProxyLike.scala5
-rw-r--r--src/library/scala/collection/SeqLike.scala107
-rw-r--r--src/library/scala/collection/SeqProxyLike.scala33
-rw-r--r--src/library/scala/collection/SeqView.scala2
-rw-r--r--src/library/scala/collection/SeqViewLike.scala49
-rw-r--r--src/library/scala/collection/SetLike.scala2
-rw-r--r--src/library/scala/collection/SortedMap.scala4
-rw-r--r--src/library/scala/collection/Traversable.scala3
-rw-r--r--src/library/scala/collection/TraversableLike.scala511
-rw-r--r--src/library/scala/collection/TraversableOnce.scala522
-rw-r--r--src/library/scala/collection/TraversableProxy.scala2
-rw-r--r--src/library/scala/collection/TraversableProxyLike.scala60
-rw-r--r--src/library/scala/collection/TraversableView.scala2
-rw-r--r--src/library/scala/collection/TraversableViewLike.scala60
-rw-r--r--src/library/scala/collection/generic/Addable.scala13
-rw-r--r--src/library/scala/collection/generic/GenericTraversableTemplate.scala4
-rw-r--r--src/library/scala/collection/generic/Growable.scala18
-rw-r--r--src/library/scala/collection/generic/IterableForwarder.scala4
-rw-r--r--src/library/scala/collection/generic/SeqForwarder.scala25
-rw-r--r--src/library/scala/collection/generic/Shrinkable.scala10
-rw-r--r--src/library/scala/collection/generic/Sorted.scala27
-rw-r--r--src/library/scala/collection/generic/Subtractable.scala11
-rw-r--r--src/library/scala/collection/generic/TraversableFactory.scala2
-rw-r--r--src/library/scala/collection/generic/TraversableForwarder.scala38
-rw-r--r--src/library/scala/collection/generic/TraversableView.scala.1152
-rwxr-xr-xsrc/library/scala/collection/immutable/DefaultMap.scala53
-rw-r--r--src/library/scala/collection/immutable/HashMap.scala487
-rw-r--r--src/library/scala/collection/immutable/HashSet.scala421
-rw-r--r--src/library/scala/collection/immutable/IndexedSeq.scala7
-rw-r--r--src/library/scala/collection/immutable/IntMap.scala5
-rw-r--r--src/library/scala/collection/immutable/LinearSeq.scala2
-rw-r--r--src/library/scala/collection/immutable/List.scala33
-rw-r--r--src/library/scala/collection/immutable/ListMap.scala2
-rw-r--r--src/library/scala/collection/immutable/LongMap.scala4
-rw-r--r--src/library/scala/collection/immutable/Map.scala20
-rw-r--r--src/library/scala/collection/immutable/MapLike.scala58
-rw-r--r--src/library/scala/collection/immutable/MapProxy.scala4
-rw-r--r--src/library/scala/collection/immutable/NumericRange.scala31
-rw-r--r--src/library/scala/collection/immutable/PagedSeq.scala6
-rw-r--r--src/library/scala/collection/immutable/Queue.scala29
-rw-r--r--src/library/scala/collection/immutable/Range.scala96
-rw-r--r--src/library/scala/collection/immutable/RedBlack.scala3
-rw-r--r--src/library/scala/collection/immutable/Set.scala14
-rw-r--r--src/library/scala/collection/immutable/SortedMap.scala14
-rw-r--r--src/library/scala/collection/immutable/Stack.scala16
-rw-r--r--src/library/scala/collection/immutable/Stream.scala28
-rw-r--r--src/library/scala/collection/immutable/StringLike.scala4
-rw-r--r--src/library/scala/collection/immutable/TreeSet.scala4
-rw-r--r--src/library/scala/collection/immutable/Vector.scala177
-rw-r--r--src/library/scala/collection/interfaces/MapMethods.scala8
-rw-r--r--src/library/scala/collection/interfaces/SeqMethods.scala2
-rw-r--r--src/library/scala/collection/interfaces/SetMethods.scala6
-rw-r--r--src/library/scala/collection/interfaces/TraversableMethods.scala5
-rw-r--r--src/library/scala/collection/interfaces/TraversableOnceMethods.scala69
-rw-r--r--src/library/scala/collection/mutable/AddingBuilder.scala2
-rw-r--r--src/library/scala/collection/mutable/ArrayBuffer.scala14
-rw-r--r--src/library/scala/collection/mutable/ArrayBuilder.scala20
-rw-r--r--src/library/scala/collection/mutable/ArrayLike.scala2
-rw-r--r--src/library/scala/collection/mutable/ArrayOps.scala6
-rw-r--r--src/library/scala/collection/mutable/ArraySeq.scala (renamed from src/library/scala/collection/mutable/GenericArray.scala)16
-rw-r--r--src/library/scala/collection/mutable/ArrayStack.scala10
-rw-r--r--src/library/scala/collection/mutable/BufferLike.scala137
-rw-r--r--src/library/scala/collection/mutable/BufferProxy.scala11
-rw-r--r--src/library/scala/collection/mutable/Builder.scala5
-rw-r--r--src/library/scala/collection/mutable/ConcurrentMap.scala18
-rw-r--r--src/library/scala/collection/mutable/DoubleLinkedList.scala2
-rw-r--r--src/library/scala/collection/mutable/FlatHashTable.scala4
-rw-r--r--src/library/scala/collection/mutable/GrowingBuilder.scala30
-rw-r--r--src/library/scala/collection/mutable/HashMap.scala2
-rw-r--r--src/library/scala/collection/mutable/HashSet.scala2
-rw-r--r--src/library/scala/collection/mutable/HashTable.scala2
-rw-r--r--src/library/scala/collection/mutable/ImmutableMapAdaptor.scala13
-rw-r--r--src/library/scala/collection/mutable/IndexedSeq.scala3
-rwxr-xr-xsrc/library/scala/collection/mutable/IndexedSeqOptimized.scala21
-rw-r--r--src/library/scala/collection/mutable/IndexedSeqView.scala28
-rw-r--r--src/library/scala/collection/mutable/LazyBuilder.scala5
-rw-r--r--src/library/scala/collection/mutable/LinearSeq.scala5
-rw-r--r--src/library/scala/collection/mutable/LinkedListLike.scala2
-rw-r--r--src/library/scala/collection/mutable/ListBuffer.scala4
-rw-r--r--src/library/scala/collection/mutable/ListMap.scala2
-rw-r--r--src/library/scala/collection/mutable/MapLike.scala114
-rw-r--r--src/library/scala/collection/mutable/MapLikeBase.scala37
-rw-r--r--src/library/scala/collection/mutable/MapProxy.scala10
-rw-r--r--src/library/scala/collection/mutable/MultiMap.scala3
-rw-r--r--src/library/scala/collection/mutable/MutableList.scala2
-rw-r--r--src/library/scala/collection/mutable/OpenHashMap.scala2
-rw-r--r--src/library/scala/collection/mutable/PriorityQueue.scala55
-rw-r--r--src/library/scala/collection/mutable/PriorityQueueProxy.scala12
-rw-r--r--src/library/scala/collection/mutable/Publisher.scala2
-rw-r--r--src/library/scala/collection/mutable/Queue.scala6
-rw-r--r--src/library/scala/collection/mutable/QueueProxy.scala13
-rw-r--r--src/library/scala/collection/mutable/ResizableArray.scala2
-rw-r--r--src/library/scala/collection/mutable/SetBuilder.scala15
-rw-r--r--src/library/scala/collection/mutable/SetLike.scala92
-rw-r--r--src/library/scala/collection/mutable/Stack.scala27
-rw-r--r--src/library/scala/collection/mutable/StackProxy.scala25
-rw-r--r--src/library/scala/collection/mutable/SynchronizedBuffer.scala24
-rw-r--r--src/library/scala/collection/mutable/SynchronizedMap.scala12
-rw-r--r--src/library/scala/collection/mutable/SynchronizedPriorityQueue.scala16
-rw-r--r--src/library/scala/collection/mutable/SynchronizedQueue.scala10
-rw-r--r--src/library/scala/collection/mutable/SynchronizedSet.scala18
-rw-r--r--src/library/scala/collection/mutable/SynchronizedStack.scala10
-rw-r--r--src/library/scala/collection/mutable/WeakHashMap.scala11
-rw-r--r--src/library/scala/collection/mutable/WrappedArray.scala7
-rwxr-xr-xsrc/library/scala/collection/readme-if-you-want-to-add-something.txt50
-rw-r--r--src/library/scala/compat/Platform.scala2
-rw-r--r--src/library/scala/concurrent/DelayedLazyVal.scala10
-rw-r--r--src/library/scala/math/BigDecimal.scala7
-rw-r--r--src/library/scala/math/BigInt.scala2
-rw-r--r--src/library/scala/math/Numeric.scala15
-rw-r--r--src/library/scala/math/Ordering.scala2
-rw-r--r--src/library/scala/package.scala2
-rw-r--r--src/library/scala/reflect/ClassManifest.scala18
-rw-r--r--src/library/scala/reflect/Code.scala2
-rw-r--r--src/library/scala/reflect/Manifest.scala24
-rwxr-xr-x[-rw-r--r--]src/library/scala/reflect/NameTransformer.scala (renamed from src/library/scala/util/NameTransformer.scala)4
-rw-r--r--src/library/scala/reflect/ScalaSignature.java13
-rwxr-xr-xsrc/library/scala/reflect/generic/AnnotationInfos.scala50
-rw-r--r--src/library/scala/reflect/generic/ByteCodecs.scala209
-rwxr-xr-xsrc/library/scala/reflect/generic/Constants.scala236
-rwxr-xr-xsrc/library/scala/reflect/generic/Flags.scala198
-rwxr-xr-xsrc/library/scala/reflect/generic/Names.scala21
-rwxr-xr-xsrc/library/scala/reflect/generic/PickleBuffer.scala188
-rwxr-xr-xsrc/library/scala/reflect/generic/PickleFormat.scala223
-rwxr-xr-xsrc/library/scala/reflect/generic/Scopes.scala15
-rwxr-xr-xsrc/library/scala/reflect/generic/StandardDefinitions.scala67
-rwxr-xr-xsrc/library/scala/reflect/generic/StdNames.scala26
-rwxr-xr-xsrc/library/scala/reflect/generic/Symbols.scala194
-rwxr-xr-xsrc/library/scala/reflect/generic/Trees.scala738
-rwxr-xr-xsrc/library/scala/reflect/generic/Types.scala156
-rwxr-xr-xsrc/library/scala/reflect/generic/UnPickler.scala775
-rwxr-xr-xsrc/library/scala/reflect/generic/Universe.scala16
-rw-r--r--src/library/scala/runtime/BoxesRunTime.java103
-rw-r--r--src/library/scala/runtime/NonLocalReturnControl.scala16
-rw-r--r--src/library/scala/runtime/NonLocalReturnException.scala7
-rw-r--r--src/library/scala/runtime/RichChar.scala24
-rw-r--r--src/library/scala/runtime/ScalaRunTime.scala86
-rw-r--r--src/library/scala/testing/SUnit.scala20
-rw-r--r--src/library/scala/throws.scala14
-rw-r--r--src/library/scala/util/Properties.scala107
-rw-r--r--src/library/scala/util/Random.scala14
-rw-r--r--src/library/scala/util/Sorting.scala106
-rw-r--r--src/library/scala/util/automata/SubsetConstruction.scala2
-rw-r--r--src/library/scala/util/automata/WordBerrySethi.scala2
-rw-r--r--src/library/scala/util/control/Breaks.scala6
-rw-r--r--src/library/scala/util/control/ControlThrowable.scala (renamed from src/library/scala/util/control/ControlException.scala)8
-rw-r--r--src/library/scala/util/logging/ConsoleLogger.scala2
-rw-r--r--src/library/scala/util/matching/Regex.scala60
-rw-r--r--src/library/scala/util/parsing/ast/Binders.scala22
-rw-r--r--src/library/scala/util/parsing/combinator/Parsers.scala52
-rw-r--r--src/library/scala/util/parsing/combinator/lexical/Lexical.scala9
-rw-r--r--src/library/scala/util/parsing/combinator/lexical/Scanners.scala16
-rw-r--r--src/library/scala/util/parsing/combinator/lexical/StdLexical.scala9
-rw-r--r--src/library/scala/util/parsing/combinator/syntactical/StandardTokenParsers.scala10
-rw-r--r--src/library/scala/util/parsing/combinator/syntactical/StdTokenParsers.scala9
-rw-r--r--src/library/scala/util/parsing/combinator/syntactical/TokenParsers.scala16
-rw-r--r--src/library/scala/util/parsing/combinator/token/StdTokens.scala (renamed from src/library/scala/util/parsing/syntax/StdTokens.scala)4
-rw-r--r--src/library/scala/util/parsing/combinator/token/Tokens.scala (renamed from src/library/scala/util/parsing/syntax/Tokens.scala)4
-rw-r--r--src/library/scala/util/parsing/input/Position.scala2
-rw-r--r--src/library/scala/util/parsing/syntax/package.scala19
-rw-r--r--src/library/scala/xml/Atom.scala18
-rw-r--r--src/library/scala/xml/Attribute.scala46
-rw-r--r--src/library/scala/xml/Comment.scala2
-rw-r--r--src/library/scala/xml/Document.scala4
-rw-r--r--src/library/scala/xml/Elem.scala18
-rw-r--r--src/library/scala/xml/EntityRef.scala2
-rw-r--r--src/library/scala/xml/Equality.scala115
-rw-r--r--src/library/scala/xml/Group.scala58
-rw-r--r--src/library/scala/xml/MetaData.scala47
-rw-r--r--src/library/scala/xml/NamespaceBinding.scala13
-rw-r--r--src/library/scala/xml/Node.scala65
-rw-r--r--src/library/scala/xml/NodeBuffer.scala3
-rw-r--r--src/library/scala/xml/NodeSeq.scala38
-rw-r--r--src/library/scala/xml/Null.scala57
-rw-r--r--src/library/scala/xml/PCData.scala7
-rw-r--r--src/library/scala/xml/PrefixedAttribute.scala52
-rw-r--r--src/library/scala/xml/PrettyPrinter.scala5
-rw-r--r--src/library/scala/xml/ProcInstr.scala1
-rw-r--r--src/library/scala/xml/SpecialNode.scala4
-rw-r--r--src/library/scala/xml/Text.scala10
-rw-r--r--src/library/scala/xml/TextBuffer.scala4
-rw-r--r--src/library/scala/xml/TopScope.scala2
-rw-r--r--src/library/scala/xml/Unparsed.scala7
-rw-r--r--src/library/scala/xml/UnprefixedAttribute.scala36
-rw-r--r--src/library/scala/xml/Utility.scala38
-rw-r--r--src/library/scala/xml/XML.scala16
-rw-r--r--src/library/scala/xml/dtd/ContentModel.scala5
-rw-r--r--src/library/scala/xml/dtd/ContentModelParser.scala3
-rw-r--r--src/library/scala/xml/dtd/DTD.scala22
-rw-r--r--src/library/scala/xml/dtd/Decl.scala6
-rw-r--r--src/library/scala/xml/dtd/DocType.scala3
-rw-r--r--src/library/scala/xml/dtd/ElementValidator.scala4
-rw-r--r--src/library/scala/xml/dtd/ExternalID.scala5
-rw-r--r--src/library/scala/xml/dtd/Scanner.scala4
-rw-r--r--src/library/scala/xml/factory/Binder.scala2
-rw-r--r--src/library/scala/xml/factory/NodeFactory.scala4
-rw-r--r--src/library/scala/xml/factory/XMLLoader.scala4
-rw-r--r--src/library/scala/xml/include/XIncludeException.scala2
-rw-r--r--src/library/scala/xml/include/sax/Main.scala7
-rw-r--r--src/library/scala/xml/include/sax/XIncludeFilter.scala93
-rw-r--r--src/library/scala/xml/include/sax/XIncluder.scala24
-rw-r--r--src/library/scala/xml/package.scala18
-rw-r--r--src/library/scala/xml/parsing/ConstructingParser.scala43
-rw-r--r--src/library/scala/xml/parsing/DefaultMarkupHandler.scala2
-rw-r--r--src/library/scala/xml/parsing/FactoryAdapter.scala11
-rw-r--r--src/library/scala/xml/parsing/FatalError.scala7
-rw-r--r--src/library/scala/xml/parsing/MarkupHandler.scala7
-rw-r--r--src/library/scala/xml/parsing/MarkupParser.scala255
-rw-r--r--src/library/scala/xml/parsing/MarkupParserCommon.scala180
-rw-r--r--src/library/scala/xml/parsing/NoBindingFactoryAdapter.scala4
-rw-r--r--src/library/scala/xml/parsing/TokenTests.scala2
-rw-r--r--src/library/scala/xml/parsing/ValidatingMarkupHandler.scala2
-rw-r--r--src/library/scala/xml/parsing/XhtmlEntities.scala3
243 files changed, 7695 insertions, 3991 deletions
diff --git a/src/library/scala/Application.scala b/src/library/scala/Application.scala
index e9b97b5356..fdb122f5bf 100644
--- a/src/library/scala/Application.scala
+++ b/src/library/scala/Application.scala
@@ -11,7 +11,6 @@
package scala
-import java.lang.System.getProperty
import scala.compat.Platform.currentTime
/** <p>
@@ -84,7 +83,7 @@ trait Application {
* @param args the arguments passed to the main method
*/
def main(args: Array[String]) {
- if (getProperty("scala.time") ne null) {
+ if (util.Properties.propIsSet("scala.time")) {
val total = currentTime - executionStart
Console.println("[total " + total + "ms]")
}
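`util.Properties.propIsSet`, which the hunk above switches to, is essentially a null-safe wrapper over `System.getProperty`; a sketch of the equivalent check:

```scala
// A system property counts as set when getProperty returns non-null,
// even if the value is the empty string. Equivalent of the old raw
// "ne null" check that the hunk above replaces.
def propIsSet(name: String): Boolean =
  System.getProperty(name) != null

System.setProperty("demo.scala.time", "")
val set = propIsSet("demo.scala.time")       // true: present but empty
val unset = propIsSet("demo.never.set.prop") // false: absent
```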
diff --git a/src/library/scala/Array.scala b/src/library/scala/Array.scala
index afaaed7c7c..f89e8b48a5 100644
--- a/src/library/scala/Array.scala
+++ b/src/library/scala/Array.scala
@@ -12,7 +12,7 @@
package scala
import scala.collection.generic._
-import scala.collection.mutable.{ArrayBuilder, GenericArray}
+import scala.collection.mutable.{ArrayBuilder, ArraySeq}
import compat.Platform.arraycopy
import scala.reflect.ClassManifest
import scala.runtime.ScalaRunTime.{array_apply, array_update}
@@ -24,15 +24,15 @@ class FallbackArrayBuilding {
/** A builder factory that generates a generic array.
* Called instead of Array.newBuilder if the element type of an array
- * does not have a class manifest. Note that fallbackBuilder fcatory
+ * does not have a class manifest. Note that fallbackBuilder factory
* needs an implicit parameter (otherwise it would not be dominated in implicit search
* by Array.canBuildFrom). We make sure that that implicit search is always
- * succesfull.
+ * successfull.
*/
- implicit def fallbackCanBuildFrom[T](implicit m: DummyImplicit): CanBuildFrom[Array[_], T, GenericArray[T]] =
- new CanBuildFrom[Array[_], T, GenericArray[T]] {
- def apply(from: Array[_]) = GenericArray.newBuilder[T]
- def apply() = GenericArray.newBuilder[T]
+ implicit def fallbackCanBuildFrom[T](implicit m: DummyImplicit): CanBuildFrom[Array[_], T, ArraySeq[T]] =
+ new CanBuildFrom[Array[_], T, ArraySeq[T]] {
+ def apply(from: Array[_]) = ArraySeq.newBuilder[T]
+ def apply() = ArraySeq.newBuilder[T]
}
}
@@ -55,10 +55,13 @@ object Array extends FallbackArrayBuilding {
dest : AnyRef,
destPos : Int,
length : Int) {
- var i = 0
- while (i < length) {
- array_update(dest, i, array_apply(src, i))
+ var i = srcPos
+ var j = destPos
+ val srcUntil = srcPos + length
+ while (i < srcUntil) {
+ array_update(dest, j, array_apply(src, i))
i += 1
+ j += 1
}
}
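The hunk above fixes the generic copy fallback to honor both offsets; the old loop always read from index 0 and wrote from index 0, ignoring srcPos and destPos. The corrected indexing, shown here as a monomorphic Int sketch with an illustrative name (the real method works on AnyRef arrays through array_apply/array_update):

```scala
// Reads `length` elements starting at srcPos and writes them starting
// at destPos, matching System.arraycopy semantics.
def copyRange(src: Array[Int], srcPos: Int,
              dest: Array[Int], destPos: Int, length: Int): Unit = {
  var i = srcPos
  var j = destPos
  val srcUntil = srcPos + length
  while (i < srcUntil) {
    dest(j) = src(i)
    i += 1
    j += 1
  }
}

val src  = Array(1, 2, 3, 4, 5)
val dest = new Array[Int](5)
copyRange(src, 1, dest, 2, 3)
// dest is now Array(0, 0, 2, 3, 4)
```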
diff --git a/src/library/scala/Console.scala b/src/library/scala/Console.scala
index 7923b6be65..fc33fa07ef 100644
--- a/src/library/scala/Console.scala
+++ b/src/library/scala/Console.scala
@@ -83,7 +83,7 @@ object Console {
/** Set the default output stream.
*
- * @param@ out the new output stream.
+ * @param out the new output stream.
*/
def setOut(out: OutputStream): Unit =
setOut(new PrintStream(out))
diff --git a/src/library/scala/Enumeration.scala b/src/library/scala/Enumeration.scala
index dfe48e3d00..3c8f5cf0bd 100644
--- a/src/library/scala/Enumeration.scala
+++ b/src/library/scala/Enumeration.scala
@@ -16,6 +16,15 @@ import scala.collection.mutable.{Builder, AddingBuilder, Map, HashMap}
import scala.collection.immutable.{Set, BitSet}
import scala.collection.generic.CanBuildFrom
+private object Enumeration {
+
+ /* This map is used to cache enumeration instances for
+ resolving enumeration _values_ to equal objects (by-reference)
+ when values are deserialized. */
+ private val emap: Map[Class[_], Enumeration] = new HashMap
+
+}
+
/** <p>
* Defines a finite set of values specific to the enumeration. Typically
* these values enumerate all possible forms something can take and provide a
@@ -56,11 +65,32 @@ import scala.collection.generic.CanBuildFrom
*/
@serializable
@SerialVersionUID(8476000850333817230L)
-abstract class Enumeration(initial: Int, names: String*) {
+abstract class Enumeration(initial: Int, names: String*) { thisenum =>
def this() = this(0, null)
def this(names: String*) = this(0, names: _*)
+ Enumeration.synchronized {
+ Enumeration.emap.get(getClass) match {
+ case None =>
+ Enumeration.emap += (getClass -> this)
+ case Some(_) =>
+ /* do nothing */
+ }
+ }
+
+ /* Note that `readResolve` cannot be private, since otherwise
+ the JVM does not invoke it when deserializing subclasses. */
+ protected def readResolve(): AnyRef = Enumeration.synchronized {
+ Enumeration.emap.get(getClass) match {
+ case None =>
+ Enumeration.emap += (getClass -> this)
+ this
+ case Some(existing) =>
+ existing
+ }
+ }
+
/** The name of this enumeration.
*/
override def toString = {
@@ -90,7 +120,7 @@ abstract class Enumeration(initial: Int, names: String*) {
*/
def values: ValueSet = {
if (!vsetDefined) {
- vset = new ValueSet(BitSet.empty ++ (vmap.valuesIterator map (_.id)))
+ vset = new ValueSet(BitSet.empty ++ (vmap.values map (_.id)))
vsetDefined = true
}
vset
@@ -164,34 +194,41 @@ abstract class Enumeration(initial: Int, names: String*) {
/* Obtains the name for the value with id `i`. If no name is cached
* in `nmap`, it populates `nmap` using reflection.
*/
- private def nameOf(i: Int): String = nmap.get(i) match {
- case Some(name) => name
- case None =>
- val methods = getClass.getMethods
- for (m <- methods
- if classOf[Value].isAssignableFrom(m.getReturnType) &&
- !java.lang.reflect.Modifier.isFinal(m.getModifiers)) {
- val name = m.getName
- // invoke method to obtain actual `Value` instance
- val value = m.invoke(this)
- // invoke `id` method
- val idMeth = classOf[Val].getMethod("id")
- val id: Int = idMeth.invoke(value).asInstanceOf[java.lang.Integer].intValue()
- nmap += (id -> name)
- }
- nmap(i)
+ private def nameOf(i: Int): String = synchronized {
+ def isValDef(m: java.lang.reflect.Method) =
+ getClass.getDeclaredFields.exists(fd => fd.getName == m.getName &&
+ fd.getType == m.getReturnType)
+ nmap.get(i) match {
+ case Some(name) => name
+ case None =>
+ val methods = getClass.getMethods
+ for (m <- methods
+ if (classOf[Value].isAssignableFrom(m.getReturnType) &&
+ !java.lang.reflect.Modifier.isFinal(m.getModifiers) &&
+ m.getParameterTypes.isEmpty &&
+ isValDef(m))) {
+ val name = m.getName
+ // invoke method to obtain actual `Value` instance
+ val value = m.invoke(this)
+ // invoke `id` method
+ val idMeth = classOf[Val].getMethod("id")
+ val id: Int = idMeth.invoke(value).asInstanceOf[java.lang.Integer].intValue()
+ nmap += (id -> name)
+ }
+ nmap(i)
+ }
}
/** The type of the enumerated values. */
@serializable
@SerialVersionUID(7091335633555234129L)
- abstract class Value extends Ordered[Enumeration#Value] {
+ abstract class Value extends Ordered[Value] {
/** the id and bit location of this enumeration value */
def id: Int
- override def compare(that: Enumeration#Value): Int = this.id - that.id
+ override def compare(that: Value): Int = this.id - that.id
override def equals(other: Any): Boolean =
other match {
- case that: Enumeration#Value => compare(that) == 0
+ case that: thisenum.Value => compare(that) == 0
case _ => false
}
override def hashCode: Int = id.hashCode
@@ -204,7 +241,7 @@ abstract class Enumeration(initial: Int, names: String*) {
if (id >= 32) throw new IllegalArgumentException
1 << id
}
- /** this enumeration value as an <code>Long</code> bit mask.
+ /** this enumeration value as a <code>Long</code> bit mask.
* @throws IllegalArgumentException if <code>id</code> is greater than 63
*/
@deprecated("mask64 will be removed")
@@ -216,7 +253,7 @@ abstract class Enumeration(initial: Int, names: String*) {
/** A class implementing the <a href="Enumeration.Value.html"
* target="contentFrame"><code>Value</code></a> type. This class can be
- * overriden to change the enumeration's naming and integer identification
+ * overridden to change the enumeration's naming and integer identification
* behaviour.
*/
@serializable
@@ -236,9 +273,16 @@ abstract class Enumeration(initial: Int, names: String*) {
override def toString() =
if (name eq null) Enumeration.this.nameOf(i)
else name
- private def readResolve(): AnyRef =
- if (vmap ne null) vmap(i)
+ protected def readResolve(): AnyRef = {
+ val enum = Enumeration.synchronized {
+ Enumeration.emap.get(Enumeration.this.getClass) match {
+ case None => Enumeration.this
+ case Some(existing) => existing
+ }
+ }
+ if (enum.vmap ne null) enum.vmap(i)
else this
+ }
}
/** A class for sets of values
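One observable effect of narrowing `Value` from `Ordered[Enumeration#Value]` to the path-dependent `thisenum.Value` is that values of two unrelated enumerations no longer compare equal merely because their ids coincide. A sketch with hypothetical `Color`/`Size` enumerations (not from the patch):

```scala
object EnumEqualityDemo {
  object Color extends Enumeration { val Red, Green  = Value }
  object Size  extends Enumeration { val Small, Large = Value }

  // Color.Red and Size.Small both carry id 0. Under the old
  // Enumeration#Value equals they compared equal; with the pattern
  // match on thisenum.Value, cross-enumeration equality is false.
  def crossEqual: Boolean = Color.Red == Size.Small
}
```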
diff --git a/src/library/scala/Function.scala b/src/library/scala/Function.scala
index 0409d938fd..6ef137aa2b 100644
--- a/src/library/scala/Function.scala
+++ b/src/library/scala/Function.scala
@@ -93,10 +93,13 @@ object Function
/** Tupling for functions of arity 2. This transforms a function
* of arity 2 into a unary function that takes a pair of arguments.
*
+ * @note These functions are slotted for deprecation, but deprecation
+ * is on hold pending superior type inference for tupling anonymous
+ * functions.
+ *
* @param f ...
* @return ...
*/
- @deprecated("Use `f.tupled` instead")
+ // @deprecated("Use `f.tupled` instead")
def tupled[a1, a2, b](f: (a1, a2) => b): Tuple2[a1, a2] => b = {
case Tuple2(x1, x2) => f(x1, x2)
}
@@ -104,7 +107,7 @@ object Function
/** Tupling for functions of arity 3. This transforms a function
* of arity 3 into a unary function that takes a triple of arguments.
*/
- @deprecated("Use `f.tupled` instead")
+ // @deprecated("Use `f.tupled` instead")
def tupled[a1, a2, a3, b](f: (a1, a2, a3) => b): Tuple3[a1, a2, a3] => b = {
case Tuple3(x1, x2, x3) => f(x1, x2, x3)
}
@@ -112,7 +115,7 @@ object Function
/** Tupling for functions of arity 4. This transforms a function
* of arity 4 into a unary function that takes a 4-tuple of arguments.
*/
- @deprecated("Use `f.tupled` instead")
+ // @deprecated("Use `f.tupled` instead")
def tupled[a1, a2, a3, a4, b](f: (a1, a2, a3, a4) => b): Tuple4[a1, a2, a3, a4] => b = {
case Tuple4(x1, x2, x3, x4) => f(x1, x2, x3, x4)
}
@@ -120,7 +123,7 @@ object Function
/** Tupling for functions of arity 5. This transforms a function
* of arity 5 into a unary function that takes a 5-tuple of arguments.
*/
- @deprecated("Use `f.tupled` instead")
+ // @deprecated("Use `f.tupled` instead")
def tupled[a1, a2, a3, a4, a5, b](f: (a1, a2, a3, a4, a5) => b): Tuple5[a1, a2, a3, a4, a5] => b = {
case Tuple5(x1, x2, x3, x4, x5) => f(x1, x2, x3, x4, x5)
}
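The `tupled` helpers above remain usable while the deprecation is on hold; `Function2` instances also expose the equivalent `.tupled` method that the (commented-out) deprecation message recommends. A quick sketch:

```scala
object TupledDemo {
  val add: (Int, Int) => Int = _ + _

  // Function.tupled turns an arity-2 function into one taking a pair.
  val addPair: ((Int, Int)) => Int = Function.tupled(add)

  // The method form suggested by the deprecation message.
  val addPair2: ((Int, Int)) => Int = add.tupled
}
```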
diff --git a/src/library/scala/Immutable.scala b/src/library/scala/Immutable.scala
index 3b6fe28d52..bc0a6100f6 100644
--- a/src/library/scala/Immutable.scala
+++ b/src/library/scala/Immutable.scala
@@ -11,7 +11,7 @@
package scala
-/** A marker trait for all immutable datastructures such as imutable
+/** A marker trait for all immutable datastructures such as immutable
* collections.
*
* @since 2.8
diff --git a/src/library/scala/LowPriorityImplicits.scala b/src/library/scala/LowPriorityImplicits.scala
index d5a3727f66..899dbe27d7 100644
--- a/src/library/scala/LowPriorityImplicits.scala
+++ b/src/library/scala/LowPriorityImplicits.scala
@@ -26,21 +26,21 @@ import collection.generic.CanBuildFrom
class LowPriorityImplicits {
implicit def genericWrapArray[T](xs: Array[T]): WrappedArray[T] =
- WrappedArray.make(xs)
+ if (xs ne null) WrappedArray.make(xs) else null
- implicit def wrapRefArray[T <: AnyRef](xs: Array[T]): WrappedArray[T] = new WrappedArray.ofRef[T](xs)
- implicit def wrapIntArray(xs: Array[Int]): WrappedArray[Int] = new WrappedArray.ofInt(xs)
- implicit def wrapDoubleArray(xs: Array[Double]): WrappedArray[Double] = new WrappedArray.ofDouble(xs)
- implicit def wrapLongArray(xs: Array[Long]): WrappedArray[Long] = new WrappedArray.ofLong(xs)
- implicit def wrapFloatArray(xs: Array[Float]): WrappedArray[Float] = new WrappedArray.ofFloat(xs)
- implicit def wrapCharArray(xs: Array[Char]): WrappedArray[Char] = new WrappedArray.ofChar(xs)
- implicit def wrapByteArray(xs: Array[Byte]): WrappedArray[Byte] = new WrappedArray.ofByte(xs)
- implicit def wrapShortArray(xs: Array[Short]): WrappedArray[Short] = new WrappedArray.ofShort(xs)
- implicit def wrapBooleanArray(xs: Array[Boolean]): WrappedArray[Boolean] = new WrappedArray.ofBoolean(xs)
- implicit def wrapUnitArray(xs: Array[Unit]): WrappedArray[Unit] = new WrappedArray.ofUnit(xs)
+ implicit def wrapRefArray[T <: AnyRef](xs: Array[T]): WrappedArray[T] = if (xs ne null) new WrappedArray.ofRef[T](xs) else null
+ implicit def wrapIntArray(xs: Array[Int]): WrappedArray[Int] = if (xs ne null) new WrappedArray.ofInt(xs) else null
+ implicit def wrapDoubleArray(xs: Array[Double]): WrappedArray[Double] = if (xs ne null) new WrappedArray.ofDouble(xs) else null
+ implicit def wrapLongArray(xs: Array[Long]): WrappedArray[Long] = if (xs ne null) new WrappedArray.ofLong(xs) else null
+ implicit def wrapFloatArray(xs: Array[Float]): WrappedArray[Float] = if (xs ne null) new WrappedArray.ofFloat(xs) else null
+ implicit def wrapCharArray(xs: Array[Char]): WrappedArray[Char] = if (xs ne null) new WrappedArray.ofChar(xs) else null
+ implicit def wrapByteArray(xs: Array[Byte]): WrappedArray[Byte] = if (xs ne null) new WrappedArray.ofByte(xs) else null
+ implicit def wrapShortArray(xs: Array[Short]): WrappedArray[Short] = if (xs ne null) new WrappedArray.ofShort(xs) else null
+ implicit def wrapBooleanArray(xs: Array[Boolean]): WrappedArray[Boolean] = if (xs ne null) new WrappedArray.ofBoolean(xs) else null
+ implicit def wrapUnitArray(xs: Array[Unit]): WrappedArray[Unit] = if (xs ne null) new WrappedArray.ofUnit(xs) else null
- implicit def wrapString(s: String): WrappedString = new WrappedString(s)
- implicit def unwrapString(ws: WrappedString): String = ws.self
+ implicit def wrapString(s: String): WrappedString = if (s ne null) new WrappedString(s) else null
+ implicit def unwrapString(ws: WrappedString): String = if (ws ne null) ws.self else null
implicit def fallbackStringCanBuildFrom[T]: CanBuildFrom[String, T, collection.immutable.IndexedSeq[T]] =
new CanBuildFrom[String, T, collection.immutable.IndexedSeq[T]] {
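The rewritten conversions above all follow one pattern: a `null` input maps to a `null` result, rather than throwing a `NullPointerException` or producing a wrapper around `null`. A generic sketch of the pattern (illustrative only, not the library code):

```scala
object NullSafeWrapDemo {
  // Null-preserving wrap, mirroring e.g. wrapIntArray/wrapString above:
  // constructing a wrapper eagerly dereferences xs, so a null input
  // must be passed through untouched.
  def wrap[T](xs: Array[T]): Seq[T] =
    if (xs ne null) xs.toSeq else null
}
```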
diff --git a/src/library/scala/NotDefinedError.scala b/src/library/scala/NotDefinedError.scala
index c1939a4e9a..a47613fb9a 100644
--- a/src/library/scala/NotDefinedError.scala
+++ b/src/library/scala/NotDefinedError.scala
@@ -14,4 +14,5 @@ package scala
/**
* @since 2.0
*/
+@deprecated("Use a custom Error class instead")
final class NotDefinedError(msg: String) extends Error("not defined: " + msg)
diff --git a/src/library/scala/Option.scala b/src/library/scala/Option.scala
index 8511fa78a5..f2da220775 100644
--- a/src/library/scala/Option.scala
+++ b/src/library/scala/Option.scala
@@ -35,6 +35,7 @@ object Option
* @version 1.1, 16/01/2007
*/
sealed abstract class Option[+A] extends Product {
+ self =>
/** True if the option is the <code>None</code> value, false otherwise.
*/
@@ -45,7 +46,7 @@ sealed abstract class Option[+A] extends Product {
def isDefined: Boolean = !isEmpty
/** get the value of this option.
- * @requires that the option is nonEmpty.
+ * @note The option must be nonEmpty.
* @throws Predef.NoSuchElementException if the option is empty.
*/
def get: A
@@ -89,6 +90,22 @@ sealed abstract class Option[+A] extends Product {
def filter(p: A => Boolean): Option[A] =
if (isEmpty || p(this.get)) this else None
+ /** Necessary to keep Option from being implicitly converted to
+ * Iterable in for comprehensions.
+ */
+ def withFilter(p: A => Boolean): WithFilter = new WithFilter(p)
+
+ /** We need a whole WithFilter class to honor the "doesn't create a new
+ * collection" contract even though it seems unlikely to matter much in a
+ * collection with max size 1.
+ */
+ class WithFilter(p: A => Boolean) {
+ def map[B](f: A => B): Option[B] = self filter p map f
+ def flatMap[B](f: A => Option[B]): Option[B] = self filter p flatMap f
+ def foreach[U](f: A => U): Unit = self filter p foreach f
+ def withFilter(q: A => Boolean): WithFilter = new WithFilter(x => p(x) && q(x))
+ }
+
/** If the option is nonempty, p(value), otherwise false.
*
* @param p the predicate to test
@@ -110,7 +127,7 @@ sealed abstract class Option[+A] extends Product {
*
* @param pf the partial function.
*/
- def partialMap[B](pf: PartialFunction[A, B]): Option[B] =
+ def collect[B](pf: PartialFunction[A, B]): Option[B] =
if (!isEmpty && pf.isDefinedAt(this.get)) Some(pf(this.get)) else None
/** If the option is nonempty return it,
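Two user-visible changes land in this file: `partialMap` is renamed `collect` (matching the collections API), and `withFilter` lets `if` guards in for-comprehensions stay within `Option` instead of triggering an implicit conversion to `Iterable`. A sketch:

```scala
object OptionDemo {
  // withFilter backs the `if` guard; no intermediate collection is built.
  val doubled: Option[Int] =
    for (x <- Option(10) if x > 5) yield x * 2

  // collect (formerly partialMap): map through a partial function,
  // yielding None where the function is not defined.
  val odd:  Option[Int] = Some(3).collect { case n if n % 2 == 1 => n * 10 }
  val none: Option[Int] = Some(4).collect { case n if n % 2 == 1 => n * 10 }
}
```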
diff --git a/src/library/scala/Predef.scala b/src/library/scala/Predef.scala
index 5684c91aaa..2037705bab 100644
--- a/src/library/scala/Predef.scala
+++ b/src/library/scala/Predef.scala
@@ -14,6 +14,8 @@ package scala
import collection.immutable.StringOps
import collection.mutable.ArrayOps
import collection.generic.CanBuildFrom
+import annotation.elidable
+import annotation.elidable.ASSERTION
/** The <code>Predef</code> object provides definitions that are
* accessible in all Scala compilation units without explicit
@@ -53,22 +55,6 @@ object Predef extends LowPriorityImplicits {
@inline def locally[T](x: T): T = x
- // hashcode -----------------------------------------------------------
-
- @inline def hash(x: Any): Int =
- if (x.isInstanceOf[Number]) runtime.BoxesRunTime.hashFromNumber(x.asInstanceOf[Number])
- else x.hashCode
-
- @inline def hash(x: Number): Int =
- runtime.BoxesRunTime.hashFromNumber(x)
-
- @inline def hash(x: java.lang.Long): Int = {
- val iv = x.intValue
- if (iv == x.longValue) iv else x.hashCode
- }
-
- @inline def hash(x: Int): Int = x
-
// errors and asserts -------------------------------------------------
def error(message: String): Nothing = throw new RuntimeException(message)
@@ -80,38 +66,82 @@ object Predef extends LowPriorityImplicits {
throw new Throwable()
}
- import annotation.elidable
- import annotation.elidable.ASSERTION
-
+ /** Tests an expression, throwing an AssertionError if false.
+ * Calls to this method will not be generated if -Xelide-below
+ * is at least ASSERTION.
+ *
+ * @see elidable
+ * @param p the expression to test
+ */
@elidable(ASSERTION)
def assert(assertion: Boolean) {
if (!assertion)
throw new java.lang.AssertionError("assertion failed")
}
+ /** Tests an expression, throwing an AssertionError if false.
+ * Calls to this method will not be generated if -Xelide-below
+ * is at least ASSERTION.
+ *
+ * @see elidable
+ * @param p the expression to test
+ * @param msg a String to include in the failure message
+ */
@elidable(ASSERTION)
def assert(assertion: Boolean, message: => Any) {
if (!assertion)
throw new java.lang.AssertionError("assertion failed: "+ message)
}
+ /** Tests an expression, throwing an AssertionError if false.
+ * This method differs from assert only in the intent expressed:
+ * assert contains a predicate which needs to be proven, while
+ * assume contains an axiom for a static checker. Calls to this method
+ * will not be generated if -Xelide-below is at least ASSERTION.
+ *
+ * @see elidable
+ * @param p the expression to test
+ */
@elidable(ASSERTION)
def assume(assumption: Boolean) {
if (!assumption)
throw new java.lang.AssertionError("assumption failed")
}
+ /** Tests an expression, throwing an AssertionError if false.
+ * This method differs from assert only in the intent expressed:
+ * assert contains a predicate which needs to be proven, while
+ * assume contains an axiom for a static checker. Calls to this method
+ * will not be generated if -Xelide-below is at least ASSERTION.
+ *
+ * @see elidable
+ * @param p the expression to test
+ * @param msg a String to include in the failure message
+ */
@elidable(ASSERTION)
def assume(assumption: Boolean, message: => Any) {
if (!assumption)
throw new java.lang.AssertionError("assumption failed: "+ message)
}
+ /** Tests an expression, throwing an IllegalArgumentException if false.
+ * This method is similar to assert, but blames the caller of the method
+ * for violating the condition.
+ *
+ * @param p the expression to test
+ */
def require(requirement: Boolean) {
if (!requirement)
throw new IllegalArgumentException("requirement failed")
}
+ /** Tests an expression, throwing an IllegalArgumentException if false.
+ * This method is similar to assert, but blames the caller of the method
+ * for violating the condition.
+ *
+ * @param p the expression to test
+ * @param msg a String to include in the failure message
+ */
def require(requirement: Boolean, message: => Any) {
if (!requirement)
throw new IllegalArgumentException("requirement failed: "+ message)
@@ -150,7 +180,7 @@ object Predef extends LowPriorityImplicits {
def print(x: Any) = Console.print(x)
def println() = Console.println()
def println(x: Any) = Console.println(x)
- def printf(text: String, xs: Any*) = Console.printf(format(text, xs: _*))
+ def printf(text: String, xs: Any*) = Console.print(format(text, xs: _*))
def format(text: String, xs: Any*) = augmentString(text).format(xs: _*)
def readLine(): String = Console.readLine()
@@ -291,7 +321,7 @@ object Predef extends LowPriorityImplicits {
implicit def conformsOrViewsAs[A <% B, B]: A <%< B = new (A <%< B) {def apply(x: A) = x}
}
- /** A type for which there is aways an implicit value.
+ /** A type for which there is always an implicit value.
* @see fallbackCanBuildFrom in Array.scala
*/
class DummyImplicit
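The new Scaladoc above draws the intended distinction: `assert`/`assume` guard internal invariants and are elidable via `-Xelide-below` (throwing `AssertionError`), while `require` blames the caller with `IllegalArgumentException` and carries no `@elidable` annotation. A sketch:

```scala
object CheckDemo {
  def sqrt(x: Double): Double = {
    // Caller's fault if violated: IllegalArgumentException, not elided.
    require(x >= 0, "negative input: " + x)
    val r = math.sqrt(x)
    // Internal invariant: AssertionError, removable with -Xelide-below.
    assert(r * r <= x + 1e-9, "sqrt postcondition")
    r
  }
}
```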
diff --git a/src/library/scala/Product.scala b/src/library/scala/Product.scala
index 8521cf2437..a0503cfe4c 100644
--- a/src/library/scala/Product.scala
+++ b/src/library/scala/Product.scala
@@ -20,7 +20,7 @@ package scala
*/
trait Product extends Equals {
- /** for a product <code>A(x_1,...,x_k)</code>, returns <code>x_(n+1)</code>
+ /** For a product <code>A(x_1,...,x_k)</code>, returns <code>x_(n+1)</code>
* for <code>0 &lt;= n &lt; k</code>
*
* @param n the index of the element to return
@@ -29,6 +29,21 @@ trait Product extends Equals {
*/
def productElement(n: Int): Any
+ // !!! This will be disabled pending reimplementation, but it can't be removed
+ // until starr forgets about it.
+
+ /** Returns the name of the field at the given index from the definition
+ * of the class.
+ *
+ * @param n the index of the element name to return
+ * @throws NoSuchElementException if the name data is unavailable for any reason
+ * @throws IndexOutOfBoundsException if the index is out of range
+ * @return a String representing the field name
+ */
+ def productElementName(n: Int): String =
+ // the method implementation is synthetic - if it is not generated we always throw.
+ throw new NoSuchElementException()
+
/** return k for a product <code>A(x_1,...,x_k)</code>
*/
def productArity: Int
diff --git a/src/library/scala/Tuple2.scala b/src/library/scala/Tuple2.scala
index 2a4797ab5a..8c4e5973c5 100644
--- a/src/library/scala/Tuple2.scala
+++ b/src/library/scala/Tuple2.scala
@@ -42,6 +42,11 @@ case class Tuple2[+T1, +T2](_1:T1,_2:T2)
b1.result
}
+ /** Wraps a tuple in a `Zipped`, which supports 2-ary generalisations of map, flatMap, filter,...
+ *
+ * @see Zipped
+ * $willNotTerminateInf
+ */
def zipped[Repr1, El1, Repr2, El2](implicit w1: T1 => TraversableLike[El1, Repr1], w2: T2 => IterableLike[El2, Repr2]): Zipped[Repr1, El1, Repr2, El2]
= new Zipped[Repr1, El1, Repr2, El2](_1, _2)
diff --git a/src/library/scala/Tuple3.scala b/src/library/scala/Tuple3.scala
index b70310db3f..a1fca95e4d 100644
--- a/src/library/scala/Tuple3.scala
+++ b/src/library/scala/Tuple3.scala
@@ -41,6 +41,11 @@ case class Tuple3[+T1, +T2, +T3](_1:T1,_2:T2,_3:T3)
b1.result
}
+ /** Wraps a tuple in a `Zipped`, which supports 3-ary generalisations of map, flatMap, filter,...
+ *
+ * @see Zipped
+ * $willNotTerminateInf
+ */
def zipped[Repr1, El1, Repr2, El2, Repr3, El3](implicit w1: T1 => TraversableLike[El1, Repr1],
w2: T2 => IterableLike[El2, Repr2],
w3: T3 => IterableLike[El3, Repr3]): Zipped[Repr1, El1, Repr2, El2, Repr3, El3]
diff --git a/src/library/scala/annotation/elidable.scala b/src/library/scala/annotation/elidable.scala
index 4f29c8f2ab..c75299e9fd 100644
--- a/src/library/scala/annotation/elidable.scala
+++ b/src/library/scala/annotation/elidable.scala
@@ -13,18 +13,18 @@ import java.util.logging.Level
/** An annotation for methods for which invocations might
* be removed in the generated code.
*
- * Behavior is influenced by passing -Xelide-level <arg>
+ * Behavior is influenced by passing -Xelide-below <arg>
* to scalac. Methods marked elidable will be omitted from
* generated code if the priority given the annotation is lower
* than to the command line argument. Examples:
- *
+ * {{{
* import annotation.elidable._
*
* @elidable(WARNING) def foo = log("foo")
* @elidable(FINE) def bar = log("bar")
*
- * scalac -Xelide-methods-below=1000
- *
+ * scalac -Xelide-below=1000
+ * }}}
* @since 2.8
*/
final class elidable(final val level: Int) extends StaticAnnotation {}
diff --git a/src/library/scala/annotation/migration.scala b/src/library/scala/annotation/migration.scala
new file mode 100644
index 0000000000..b0915cde34
--- /dev/null
+++ b/src/library/scala/annotation/migration.scala
@@ -0,0 +1,28 @@
+/* __ *\
+** ________ ___ / / ___ Scala API **
+** / __/ __// _ | / / / _ | (c) 2002-2010, LAMP/EPFL **
+** __\ \/ /__/ __ |/ /__/ __ | http://scala-lang.org/ **
+** /____/\___/_/ |_/____/_/ | | **
+** |/ **
+\* */
+
+package scala.annotation
+
+/**
+ * An annotation that marks a member as having changed semantics
+ * between versions. This is intended for methods which for one
+ * reason or another retain the same name and type signature,
+ * but some aspect of their behavior is different. An illustrative
+ * example is Stack.iterator, whose order reversed from LIFO to FIFO
+ * between Scala 2.7 and 2.8.
+ *
+ * The version numbers mark the Scala major/minor release
+ * in which the change took place.
+ *
+ * @since 2.8
+ */
+private[scala] final class migration(
+ majorVersion: Int,
+ minorVersion: Int,
+ message: String)
+extends StaticAnnotation {}
diff --git a/src/library/scala/collection/BitSetLike.scala b/src/library/scala/collection/BitSetLike.scala
index aac731fec9..8476ede7b5 100644
--- a/src/library/scala/collection/BitSetLike.scala
+++ b/src/library/scala/collection/BitSetLike.scala
@@ -28,11 +28,10 @@ import mutable.StringBuilder
* @since 2.8
* @define coll bitset
* @define Coll BitSet
- * define bitsetinfo
+ * @define bitsetinfo
* Bitsets are sets of non-negative integers which are represented as
* variable-size arrays of bits packed into 64-bit words. The size of a bitset is
* determined by the largest number stored in it.
-
*/
trait BitSetLike[+This <: BitSetLike[This] with Set[Int]] extends SetLike[Int, This] { self =>
@@ -42,7 +41,7 @@ trait BitSetLike[+This <: BitSetLike[This] with Set[Int]] extends SetLike[Int, T
protected def nwords: Int
/** The words at index `idx', or 0L if outside the range of the set
- * @pre idx >= 0
+ * @note Requires `idx >= 0`
*/
protected def word(idx: Int): Long
diff --git a/src/library/scala/collection/IndexedSeq.scala b/src/library/scala/collection/IndexedSeq.scala
index 05141fb864..50a66e924c 100644
--- a/src/library/scala/collection/IndexedSeq.scala
+++ b/src/library/scala/collection/IndexedSeq.scala
@@ -14,15 +14,9 @@ package scala.collection
import generic._
import mutable.Builder
-/** <p>
- * Sequences that support O(1) element access and O(1) length computation.
- * </p>
- * <p>
- * This class does not add any methods to <code>Sequence</code> but
- * overrides several methods with optimized implementations.
- * </p>
+/** A base trait for indexed sequences.
+ * $indexedSeqInfo
*
- * @author Sean McDirmid
* @author Martin Odersky
* @version 2.8
* @since 2.8
diff --git a/src/library/scala/collection/IndexedSeqLike.scala b/src/library/scala/collection/IndexedSeqLike.scala
index 8164075629..ea6e1bb493 100644
--- a/src/library/scala/collection/IndexedSeqLike.scala
+++ b/src/library/scala/collection/IndexedSeqLike.scala
@@ -18,16 +18,23 @@ import scala.annotation.tailrec
/** A template trait for indexed sequences of type `IndexedSeq[A]`.
*
* $indexedSeqInfo
+ *
+ * This trait just implements `iterator` in terms of `apply` and `length`.
+ * However, see `IndexedSeqOptimized` for an implementation trait that overrides operations
+ * to make them run faster under the assumption of fast random access with `apply`.
+ *
* @author Sean McDirmid
* @author Martin Odersky
* @version 2.8
* @since 2.8
+ * @define Coll IndexedSeq
* @define indexedSeqInfo
* Indexed sequences support constant-time or near constant-time element
- * access and length computation.
+ * access and length computation. They are defined in terms of abstract methods
+ * `apply` for indexing and `length`.
*
- * Indexed sequences do not define any new methods wrt `Seq`. However, some `Seq` methods
- * are overridden with optimized implementations.
+ * Indexed sequences do not add any new methods wrt `Seq`, but promise
+ * efficient implementations of random access patterns.
*
* @tparam A the element type of the $coll
* @tparam Repr the type of the actual $coll containing the elements.
@@ -76,267 +83,7 @@ trait IndexedSeqLike[+A, +Repr] extends SeqLike[A, Repr] { self =>
override /*IterableLike*/
def iterator: Iterator[A] = new Elements(0, length)
-
- override /*IterableLike*/
- def isEmpty: Boolean = { length == 0 }
-
- override /*IterableLike*/
- def foreach[U](f: A => U): Unit = {
- var i = 0
- val len = length
- while (i < len) { f(this(i)); i += 1 }
- }
-
- override /*IterableLike*/
- def forall(p: A => Boolean): Boolean = prefixLength(p(_)) == length
-
- override /*IterableLike*/
- def exists(p: A => Boolean): Boolean = prefixLength(!p(_)) != length
-
- override /*IterableLike*/
- def find(p: A => Boolean): Option[A] = {
- val i = prefixLength(!p(_))
- if (i < length) Some(this(i)) else None
- }
/*
- override /*IterableLike*/
- def mapFind[B](f: A => Option[B]): Option[B] = {
- var i = 0
- var res: Option[B] = None
- val len = length
- while (res.isEmpty && i < len) {
- res = f(this(i))
- i += 1
- }
- res
- }
-*/
- @tailrec
- private def foldl[B](start: Int, end: Int, z: B, op: (B, A) => B): B =
- if (start == end) z
- else foldl(start + 1, end, op(z, this(start)), op)
-
- @tailrec
- private def foldr[B](start: Int, end: Int, z: B, op: (A, B) => B): B =
- if (start == end) z
- else foldr(start, end - 1, op(this(end - 1), z), op)
-
- override /*TraversableLike*/
- def foldLeft[B](z: B)(op: (B, A) => B): B =
- foldl(0, length, z, op)
-
- override /*IterableLike*/
- def foldRight[B](z: B)(op: (A, B) => B): B =
- foldr(0, length, z, op)
-
- override /*TraversableLike*/
- def reduceLeft[B >: A](op: (B, A) => B): B =
- if (length > 0) foldl(1, length, this(0), op) else super.reduceLeft(op)
-
- override /*IterableLike*/
- def reduceRight[B >: A](op: (A, B) => B): B =
- if (length > 0) foldr(0, length - 1, this(length - 1), op) else super.reduceRight(op)
-
- override /*IterableLike*/
- def zip[A1 >: A, B, That](that: Iterable[B])(implicit bf: CanBuildFrom[Repr, (A1, B), That]): That = that match {
- case that: IndexedSeq[_] =>
- val b = bf(repr)
- var i = 0
- val len = this.length min that.length
- b.sizeHint(len)
- while (i < len) {
- b += ((this(i), that(i).asInstanceOf[B]))
- i += 1
- }
- b.result
- case _ =>
- super.zip[A1, B, That](that)(bf)
- }
-
- override /*IterableLike*/
- def zipWithIndex[A1 >: A, That](implicit bf: CanBuildFrom[Repr, (A1, Int), That]): That = {
- val b = bf(repr)
- val len = length
- b.sizeHint(len)
- var i = 0
- while (i < len) {
- b += ((this(i), i))
- i += 1
- }
- b.result
- }
-
- override /*IterableLike*/
- def slice(from: Int, until: Int): Repr = {
- var i = from max 0
- val end = until min length
- val b = newBuilder
- b.sizeHint(end - i)
- while (i < end) {
- b += this(i)
- i += 1
- }
- b.result
- }
-
- override /*IterableLike*/
- def head: A = if (isEmpty) super.head else this(0)
-
- override /*TraversableLike*/
- def tail: Repr = if (isEmpty) super.tail else slice(1, length)
-
- override /*TraversableLike*/
- def last: A = if (length > 0) this(length - 1) else super.last
-
- override /*IterableLike*/
- def init: Repr = if (length > 0) slice(0, length - 1) else super.init
-
- override /*TraversableLike*/
- def take(n: Int): Repr = slice(0, n)
-
- override /*TraversableLike*/
- def drop(n: Int): Repr = slice(n, length)
-
- override /*IterableLike*/
- def takeRight(n: Int): Repr = slice(length - n, length)
-
- override /*IterableLike*/
- def dropRight(n: Int): Repr = slice(0, length - n)
-
- override /*TraversableLike*/
- def splitAt(n: Int): (Repr, Repr) = (take(n), drop(n))
-
- override /*IterableLike*/
- def takeWhile(p: A => Boolean): Repr = take(prefixLength(p))
-
- override /*TraversableLike*/
- def dropWhile(p: A => Boolean): Repr = drop(prefixLength(p))
-
- override /*TraversableLike*/
- def span(p: A => Boolean): (Repr, Repr) = splitAt(prefixLength(p))
-
- override /*IterableLike*/
- def sameElements[B >: A](that: Iterable[B]): Boolean = that match {
- case that: IndexedSeq[_] =>
- val len = length
- len == that.length && {
- var i = 0
- while (i < len && this(i) == that(i)) i += 1
- i == len
- }
- case _ =>
- super.sameElements(that)
- }
-
- override /*IterableLike*/
- def copyToArray[B >: A](xs: Array[B], start: Int, len: Int) {
- var i = 0
- var j = start
- val end = length min len min (xs.length - start)
- while (i < end) {
- xs(j) = this(i)
- i += 1
- j += 1
- }
- }
-
-
- // Overridden methods from Seq
-
- override /*SeqLike*/
- def lengthCompare(len: Int): Int = length - len
-
- override /*SeqLike*/
- def segmentLength(p: A => Boolean, from: Int): Int = {
- val start = from
- val len = length
- var i = start
- while (i < len && p(this(i))) i += 1
- i - start
- }
-
- private def negLength(n: Int) = if (n == length) -1 else n
-
- override /*SeqLike*/
- def indexWhere(p: A => Boolean, from: Int): Int = {
- val start = from max 0
- negLength(start + segmentLength(!p(_), start))
- }
-
- override /*SeqLike*/
- def lastIndexWhere(p: A => Boolean, end: Int): Int = {
- var i = end
- while (i >= 0 && !p(this(i))) i -= 1
- i
- }
-
- override /*SeqLike*/
- def reverse: Repr = {
- val b = newBuilder
- b.sizeHint(length)
- var i = length
- while (0 < i) {
- i -= 1
- b += this(i)
- }
- b.result
- }
-
- override /*SeqLike*/
- def reverseIterator: Iterator[A] = new Iterator[A] {
- private var i = self.length
- def hasNext: Boolean = 0 < i
- def next: A =
- if (0 < i) {
- i -= 1
- self(i)
- } else Iterator.empty.next
- }
-
- override /*SeqLike*/
- def startsWith[B](that: Seq[B], offset: Int): Boolean = that match {
- case that: IndexedSeq[_] =>
- var i = offset
- var j = 0
- val thisLen = length
- val thatLen = that.length
- while (i < thisLen && j < thatLen && this(i) == that(j)) {
- i += 1
- j += 1
- }
- j == thatLen
- case _ =>
- var i = offset
- val thisLen = length
- val thatElems = that.iterator
- while (i < thisLen && thatElems.hasNext) {
- if (this(i) != thatElems.next())
- return false
-
- i += 1
- }
- !thatElems.hasNext
- }
-
- override /*SeqLike*/
- def endsWith[B](that: Seq[B]): Boolean = that match {
- case that: IndexedSeq[_] =>
- var i = length - 1
- var j = that.length - 1
-
- (j <= i) && {
- while (j >= 0) {
- if (this(i) != that(j))
- return false
- i -= 1
- j -= 1
- }
- true
- }
- case _ =>
- super.endsWith(that)
- }
-
override /*SeqLike*/
def view = new IndexedSeqView[A, Repr] {
protected lazy val underlying = self.repr
@@ -347,5 +94,6 @@ trait IndexedSeqLike[+A, +Repr] extends SeqLike[A, Repr] { self =>
override /*SeqLike*/
def view(from: Int, until: Int) = view.slice(from, until)
+*/
}
diff --git a/src/library/scala/collection/IndexedSeqOptimized.scala b/src/library/scala/collection/IndexedSeqOptimized.scala
new file mode 100755
index 0000000000..12b39c8b83
--- /dev/null
+++ b/src/library/scala/collection/IndexedSeqOptimized.scala
@@ -0,0 +1,293 @@
+/* __ *\
+** ________ ___ / / ___ Scala API **
+** / __/ __// _ | / / / _ | (c) 2006-2010, LAMP/EPFL **
+** __\ \/ /__/ __ |/ /__/ __ | http://scala-lang.org/ **
+** /____/\___/_/ |_/____/_/ | | **
+** |/ **
+\* */
+
+// $Id: IndexedSeqLike.scala 20129 2009-12-14 17:12:17Z odersky $
+
+
+package scala.collection
+
+import generic._
+import mutable.ArrayBuffer
+import scala.annotation.tailrec
+
+/** A template trait for indexed sequences of type `IndexedSeq[A]` which optimizes
+ * the implementation of several methods under the assumption of fast random access.
+ *
+ * $indexedSeqInfo
+ * @author Martin Odersky
+ * @version 2.8
+ * @since 2.8
+ *
+ * @tparam A the element type of the $coll
+ * @tparam Repr the type of the actual $coll containing the elements.
+ * @define willNotTerminateInf
+ * @define mayNotTerminateInf
+ */
+trait IndexedSeqOptimized[+A, +Repr] extends IndexedSeqLike[A, Repr] { self =>
+
+ override /*IterableLike*/
+ def isEmpty: Boolean = { length == 0 }
+
+ override /*IterableLike*/
+ def foreach[U](f: A => U): Unit = {
+ var i = 0
+ val len = length
+ while (i < len) { f(this(i)); i += 1 }
+ }
+
+ override /*IterableLike*/
+ def forall(p: A => Boolean): Boolean = prefixLength(p(_)) == length
+
+ override /*IterableLike*/
+ def exists(p: A => Boolean): Boolean = prefixLength(!p(_)) != length
+
+ override /*IterableLike*/
+ def find(p: A => Boolean): Option[A] = {
+ val i = prefixLength(!p(_))
+ if (i < length) Some(this(i)) else None
+ }
+/*
+ override /*IterableLike*/
+ def mapFind[B](f: A => Option[B]): Option[B] = {
+ var i = 0
+ var res: Option[B] = None
+ val len = length
+ while (res.isEmpty && i < len) {
+ res = f(this(i))
+ i += 1
+ }
+ res
+ }
+*/
+ @tailrec
+ private def foldl[B](start: Int, end: Int, z: B, op: (B, A) => B): B =
+ if (start == end) z
+ else foldl(start + 1, end, op(z, this(start)), op)
+
+ @tailrec
+ private def foldr[B](start: Int, end: Int, z: B, op: (A, B) => B): B =
+ if (start == end) z
+ else foldr(start, end - 1, op(this(end - 1), z), op)
+
+ override /*TraversableLike*/
+ def foldLeft[B](z: B)(op: (B, A) => B): B =
+ foldl(0, length, z, op)
+
+ override /*IterableLike*/
+ def foldRight[B](z: B)(op: (A, B) => B): B =
+ foldr(0, length, z, op)
+
+ override /*TraversableLike*/
+ def reduceLeft[B >: A](op: (B, A) => B): B =
+ if (length > 0) foldl(1, length, this(0), op) else super.reduceLeft(op)
+
+ override /*IterableLike*/
+ def reduceRight[B >: A](op: (A, B) => B): B =
+ if (length > 0) foldr(0, length - 1, this(length - 1), op) else super.reduceRight(op)
+
+ override /*IterableLike*/
+ def zip[A1 >: A, B, That](that: Iterable[B])(implicit bf: CanBuildFrom[Repr, (A1, B), That]): That = that match {
+ case that: IndexedSeq[_] =>
+ val b = bf(repr)
+ var i = 0
+ val len = this.length min that.length
+ b.sizeHint(len)
+ while (i < len) {
+ b += ((this(i), that(i).asInstanceOf[B]))
+ i += 1
+ }
+ b.result
+ case _ =>
+ super.zip[A1, B, That](that)(bf)
+ }
+
+ override /*IterableLike*/
+ def zipWithIndex[A1 >: A, That](implicit bf: CanBuildFrom[Repr, (A1, Int), That]): That = {
+ val b = bf(repr)
+ val len = length
+ b.sizeHint(len)
+ var i = 0
+ while (i < len) {
+ b += ((this(i), i))
+ i += 1
+ }
+ b.result
+ }
+
+ override /*IterableLike*/
+ def slice(from: Int, until: Int): Repr = {
+ var i = from max 0
+ val end = until min length
+ val b = newBuilder
+ b.sizeHint(end - i)
+ while (i < end) {
+ b += this(i)
+ i += 1
+ }
+ b.result
+ }
+
+ override /*IterableLike*/
+ def head: A = if (isEmpty) super.head else this(0)
+
+ override /*TraversableLike*/
+ def tail: Repr = if (isEmpty) super.tail else slice(1, length)
+
+ override /*TraversableLike*/
+ def last: A = if (length > 0) this(length - 1) else super.last
+
+ override /*IterableLike*/
+ def init: Repr = if (length > 0) slice(0, length - 1) else super.init
+
+ override /*TraversableLike*/
+ def take(n: Int): Repr = slice(0, n)
+
+ override /*TraversableLike*/
+ def drop(n: Int): Repr = slice(n, length)
+
+ override /*IterableLike*/
+ def takeRight(n: Int): Repr = slice(length - n, length)
+
+ override /*IterableLike*/
+ def dropRight(n: Int): Repr = slice(0, length - n)
+
+ override /*TraversableLike*/
+ def splitAt(n: Int): (Repr, Repr) = (take(n), drop(n))
+
+ override /*IterableLike*/
+ def takeWhile(p: A => Boolean): Repr = take(prefixLength(p))
+
+ override /*TraversableLike*/
+ def dropWhile(p: A => Boolean): Repr = drop(prefixLength(p))
+
+ override /*TraversableLike*/
+ def span(p: A => Boolean): (Repr, Repr) = splitAt(prefixLength(p))
+
+ override /*IterableLike*/
+ def sameElements[B >: A](that: Iterable[B]): Boolean = that match {
+ case that: IndexedSeq[_] =>
+ val len = length
+ len == that.length && {
+ var i = 0
+ while (i < len && this(i) == that(i)) i += 1
+ i == len
+ }
+ case _ =>
+ super.sameElements(that)
+ }
+
+ override /*IterableLike*/
+ def copyToArray[B >: A](xs: Array[B], start: Int, len: Int) {
+ var i = 0
+ var j = start
+ val end = length min len min (xs.length - start)
+ while (i < end) {
+ xs(j) = this(i)
+ i += 1
+ j += 1
+ }
+ }
+
+
+ // Overridden methods from Seq
+
+ override /*SeqLike*/
+ def lengthCompare(len: Int): Int = length - len
+
+ override /*SeqLike*/
+ def segmentLength(p: A => Boolean, from: Int): Int = {
+ val start = from
+ val len = length
+ var i = start
+ while (i < len && p(this(i))) i += 1
+ i - start
+ }
+
+ private def negLength(n: Int) = if (n == length) -1 else n
+
+ override /*SeqLike*/
+ def indexWhere(p: A => Boolean, from: Int): Int = {
+ val start = from max 0
+ negLength(start + segmentLength(!p(_), start))
+ }
+
+ override /*SeqLike*/
+ def lastIndexWhere(p: A => Boolean, end: Int): Int = {
+ var i = end
+ while (i >= 0 && !p(this(i))) i -= 1
+ i
+ }
+
+ override /*SeqLike*/
+ def reverse: Repr = {
+ val b = newBuilder
+ b.sizeHint(length)
+ var i = length
+ while (0 < i) {
+ i -= 1
+ b += this(i)
+ }
+ b.result
+ }
+
+ override /*SeqLike*/
+ def reverseIterator: Iterator[A] = new Iterator[A] {
+ private var i = self.length
+ def hasNext: Boolean = 0 < i
+ def next: A =
+ if (0 < i) {
+ i -= 1
+ self(i)
+ } else Iterator.empty.next
+ }
+
+ override /*SeqLike*/
+ def startsWith[B](that: Seq[B], offset: Int): Boolean = that match {
+ case that: IndexedSeq[_] =>
+ var i = offset
+ var j = 0
+ val thisLen = length
+ val thatLen = that.length
+ while (i < thisLen && j < thatLen && this(i) == that(j)) {
+ i += 1
+ j += 1
+ }
+ j == thatLen
+ case _ =>
+ var i = offset
+ val thisLen = length
+ val thatElems = that.iterator
+ while (i < thisLen && thatElems.hasNext) {
+ if (this(i) != thatElems.next())
+ return false
+
+ i += 1
+ }
+ !thatElems.hasNext
+ }
+
+ override /*SeqLike*/
+ def endsWith[B](that: Seq[B]): Boolean = that match {
+ case that: IndexedSeq[_] =>
+ var i = length - 1
+ var j = that.length - 1
+
+ (j <= i) && {
+ while (j >= 0) {
+ if (this(i) != that(j))
+ return false
+ i -= 1
+ j -= 1
+ }
+ true
+ }
+ case _ =>
+ super.endsWith(that)
+ }
+}
+
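The new `IndexedSeqOptimized` trait above implements `foldLeft`/`foldRight` as tail-recursive loops over an integer cursor, so each step is a constant-time indexed access rather than an iterator advance. A minimal standalone sketch of that pattern (compilable with current Scala; the names `FoldByIndex` and `foldl` are illustrative, not part of the library):

```scala
import scala.annotation.tailrec

object FoldByIndex {
  // Same shape as IndexedSeqOptimized.foldl: recurse on an integer
  // cursor instead of an iterator, so @tailrec guarantees the compiler
  // turns the recursion into a loop.
  @tailrec
  def foldl[A, B](xs: IndexedSeq[A], start: Int, end: Int, z: B)(op: (B, A) => B): B =
    if (start == end) z
    else foldl(xs, start + 1, end, op(z, xs(start)))(op)

  def foldLeft[A, B](xs: IndexedSeq[A])(z: B)(op: (B, A) => B): B =
    foldl(xs, 0, xs.length, z)(op)
}
```

`FoldByIndex.foldLeft(Vector(1, 2, 3))(0)(_ + _)` evaluates as `op(op(op(0, 1), 2), 3)`, exactly like the library's `foldLeft`.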
diff --git a/src/library/scala/collection/IndexedSeqView.scala b/src/library/scala/collection/IndexedSeqView.scala
deleted file mode 100644
index 72f3374e94..0000000000
--- a/src/library/scala/collection/IndexedSeqView.scala
+++ /dev/null
@@ -1,38 +0,0 @@
-/* __ *\
-** ________ ___ / / ___ Scala API **
-** / __/ __// _ | / / / _ | (c) 2003-2010, LAMP/EPFL **
-** __\ \/ /__/ __ |/ /__/ __ | http://scala-lang.org/ **
-** /____/\___/_/ |_/____/_/ | | **
-** |/ **
-\* */
-
-// $Id$
-
-
-package scala.collection
-
-import TraversableView.NoBuilder
-import generic._
-
-/** A non-strict projection of an iterable.
- *
- * @author Sean McDirmid
- * @author Martin Odersky
- * @version 2.8
- * @since 2.8
- */
-trait IndexedSeqView[+A, +Coll] extends IndexedSeqViewLike[A, Coll, IndexedSeqView[A, Coll]]
-
-object IndexedSeqView {
- type Coll = TraversableView[_, C] forSome {type C <: Traversable[_]}
- implicit def canBuildFrom[A]: CanBuildFrom[Coll, A, IndexedSeqView[A, IndexedSeq[_]]] =
- new CanBuildFrom[Coll, A, IndexedSeqView[A, IndexedSeq[_]]] {
- def apply(from: Coll) = new NoBuilder
- def apply() = new NoBuilder
- }
- implicit def arrCanBuildFrom[A]: CanBuildFrom[TraversableView[_, Array[_]], A, IndexedSeqView[A, Array[A]]] =
- new CanBuildFrom[TraversableView[_, Array[_]], A, IndexedSeqView[A, Array[A]]] {
- def apply(from: TraversableView[_, Array[_]]) = new NoBuilder
- def apply() = new NoBuilder
- }
-}
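The `NoBuilder`-based `CanBuildFrom` instances in the file deleted above exist so that view transformers never build a strict collection. The observable contract is laziness, sketched here with the current view API (2.8's `CanBuildFrom` machinery no longer exists in modern Scala, but the behavior is the same; `ViewLaziness` and `callsForHead` are illustrative names):

```scala
object ViewLaziness {
  // Returns (head of the mapped view, number of times the mapping
  // function actually ran). A strict map would run it once per element;
  // a view runs it only for the elements that are demanded.
  def callsForHead(xs: Vector[Int]): (Int, Int) = {
    var calls = 0
    val mapped = xs.view.map { x => calls += 1; x * 2 } // nothing runs yet
    val h = mapped.head                                 // forces element 0 only
    (h, calls)
  }
}
```

For a three-element vector the mapping function fires exactly once, confirming that `map` on a view defers all work until an element is demanded.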
diff --git a/src/library/scala/collection/IndexedSeqViewLike.scala b/src/library/scala/collection/IndexedSeqViewLike.scala
deleted file mode 100644
index 07f63ad2b0..0000000000
--- a/src/library/scala/collection/IndexedSeqViewLike.scala
+++ /dev/null
@@ -1,113 +0,0 @@
-/* __ *\
-** ________ ___ / / ___ Scala API **
-** / __/ __// _ | / / / _ | (c) 2003-2010, LAMP/EPFL **
-** __\ \/ /__/ __ |/ /__/ __ | http://scala-lang.org/ **
-** /____/\___/_/ |_/____/_/ | | **
-** |/ **
-\* */
-
-// $Id: Seq.scala 16092 2008-09-12 10:37:06Z nielsen $
-
-
-package scala.collection
-
-import generic._
-import TraversableView.NoBuilder
-
-/** A template trait for a non-strict view of a IndexedSeq.
- *
- * @author Sean McDirmid
- * @author Martin Odersky
- * @version 2.8
- * @since 2.8
- */
-trait IndexedSeqViewLike[+A,
- +Coll,
- +This <: IndexedSeqView[A, Coll] with IndexedSeqViewLike[A, Coll, This]]
- extends IndexedSeq[A] with IndexedSeqLike[A, This] with SeqView[A, Coll] with SeqViewLike[A, Coll, This]
-{ self =>
-
- trait Transformed[+B] extends IndexedSeqView[B, Coll] with super.Transformed[B]
-
- trait Sliced extends Transformed[A] with super.Sliced {
- /** Override to use IndexedSeq's foreach; todo: see whether this is really faster */
- override def foreach[U](f: A => U) = super[Transformed].foreach(f)
- }
-
- trait Mapped[B] extends Transformed[B] with super.Mapped[B] {
- override def foreach[U](f: B => U) = super[Transformed].foreach(f)
- }
-
- trait FlatMapped[B] extends Transformed[B] with super.FlatMapped[B] {
- override def foreach[U](f: B => U) = super[Transformed].foreach(f)
- }
-
- trait Appended[B >: A] extends Transformed[B] with super.Appended[B] {
- override def foreach[U](f: B => U) = super[Transformed].foreach(f)
- }
-
- trait Filtered extends Transformed[A] with super.Filtered {
- override def foreach[U](f: A => U) = super[Transformed].foreach(f)
- }
-
- trait TakenWhile extends Transformed[A] with super.TakenWhile {
- override def foreach[U](f: A => U) = super[Transformed].foreach(f)
- }
-
- trait DroppedWhile extends Transformed[A] with super.DroppedWhile {
- override def foreach[U](f: A => U) = super[Transformed].foreach(f)
- }
-
- trait Reversed extends Transformed[A] with super.Reversed {
- override def foreach[U](f: A => U) = super[Transformed].foreach(f)
- }
-
- trait Patched[B >: A] extends Transformed[B] with super.Patched[B] {
- override def foreach[U](f: B => U) = super[Transformed].foreach(f)
- }
-
- trait Zipped[B] extends Transformed[(A, B)] {
- protected[this] val other: Iterable[B]
- /** Have to be careful here - other may be an infinite sequence. */
- def length =
- if (other.hasDefiniteSize) self.length min other.size
- else other take self.length size
-
- def apply(idx: Int): (A, B) = (self.apply(idx), other.iterator drop idx next)
- override def stringPrefix = self.stringPrefix+"Z"
- }
-
- trait ZippedAll[A1 >: A, B] extends Transformed[(A1, B)] {
- protected[this] val other: Iterable[B]
- val thisElem: A1
- val thatElem: B
- override def iterator: Iterator[(A1, B)] =
- self.iterator.zipAll(other.iterator, thisElem, thatElem)
-
- def length = self.length max other.size
- def apply(idx: Int): (A1, B) = {
- val z1 = if (idx < self.length) self.apply(idx) else thisElem
- val z2 = if (idx < other.size) other drop idx head else thatElem
- (z1, z2)
- }
- override def stringPrefix = self.stringPrefix+"Z"
- }
-
- /** Boilerplate method, to override in each subclass
- * This method could be eliminated if Scala had virtual classes
- */
- protected override def newAppended[B >: A](that: Traversable[B]): Transformed[B] = new Appended[B] { val rest = that }
- protected override def newMapped[B](f: A => B): Transformed[B] = new Mapped[B] { val mapping = f }
- protected override def newFlatMapped[B](f: A => Traversable[B]): Transformed[B] = new FlatMapped[B] { val mapping = f }
- protected override def newFiltered(p: A => Boolean): Transformed[A] = new Filtered { val pred = p }
- protected override def newSliced(_from: Int, _until: Int): Transformed[A] = new Sliced { val from = _from; val until = _until }
- protected override def newDroppedWhile(p: A => Boolean): Transformed[A] = new DroppedWhile { val pred = p }
- protected override def newTakenWhile(p: A => Boolean): Transformed[A] = new TakenWhile { val pred = p }
- protected override def newZipped[B](that: Iterable[B]): Transformed[(A, B)] = new Zipped[B] { val other = that }
- protected override def newZippedAll[A1 >: A, B](that: Iterable[B], _thisElem: A1, _thatElem: B): Transformed[(A1, B)] = new ZippedAll[A1, B] { val other = that; val thisElem = _thisElem; val thatElem = _thatElem }
- protected override def newReversed: Transformed[A] = new Reversed { }
- protected override def newPatched[B >: A](_from: Int, _patch: Seq[B], _replaced: Int): Transformed[B] = new Patched[B] {
- val from = _from; val patch = _patch; val replaced = _replaced
- }
- override def stringPrefix = "IndexedSeqView"
-}
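The deleted `Zipped.length` above is deliberately careful with a possibly infinite second operand: it truncates first (`other take self.length size`) instead of asking `other` for its full size. The same guard, sketched with `LazyList` standing in for 2.8's `Stream` (this sketch drops the `hasDefiniteSize` fast path of the original; `ZipLengthGuard` is an illustrative name):

```scala
object ZipLengthGuard {
  // Length of (finite zip possiblyInfinite) without ever measuring the
  // second operand in full: truncate first, then count.
  def zippedLength[A, B](self: IndexedSeq[A], other: Iterable[B]): Int =
    other.take(self.length).size
}
```

Calling `.size` directly on an infinite `LazyList` would never return; `take` bounds the traversal to at most `self.length` elements.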
diff --git a/src/library/scala/collection/IterableLike.scala b/src/library/scala/collection/IterableLike.scala
index 8446988821..6f88c72ffd 100644
--- a/src/library/scala/collection/IterableLike.scala
+++ b/src/library/scala/collection/IterableLike.scala
@@ -69,9 +69,6 @@ self =>
*/
def iterator: Iterator[A]
- @deprecated("use `iterator' instead")
- def elements = iterator
-
/** Applies a function `f` to all elements of this $coll.
*
* Note: this method underlies the implementation of most other bulk operations.
@@ -189,7 +186,7 @@ self =>
b.result
}
- /** Selects all elements except first ''n'' ones.
+ /** Selects all elements except last ''n'' ones.
* $orderDependent
*
* @param n The number of elements to take
@@ -367,6 +364,9 @@ self =>
override /*TraversableLike*/ def view(from: Int, until: Int) = view.slice(from, until)
+ @deprecated("use `iterator' instead")
+ def elements = iterator
+
@deprecated("use `head' instead") def first: A = head
/** `None` if iterable is empty.
diff --git a/src/library/scala/collection/IterableProxyLike.scala b/src/library/scala/collection/IterableProxyLike.scala
index 4400237486..fe148339b0 100644
--- a/src/library/scala/collection/IterableProxyLike.scala
+++ b/src/library/scala/collection/IterableProxyLike.scala
@@ -24,24 +24,20 @@ import mutable.Buffer
* @version 2.8
* @since 2.8
*/
-trait IterableProxyLike[+A, +This <: IterableLike[A, This] with Iterable[A]]
- extends IterableLike[A, This]
- with TraversableProxyLike[A, This]
+trait IterableProxyLike[+A, +Repr <: IterableLike[A, Repr] with Iterable[A]]
+ extends IterableLike[A, Repr]
+ with TraversableProxyLike[A, Repr]
{
override def iterator: Iterator[A] = self.iterator
- override def foreach[U](f: A => U): Unit = self.foreach(f)
- override def isEmpty: Boolean = self.isEmpty
- override def foldRight[B](z: B)(op: (A, B) => B): B = self.foldRight(z)(op)
- override def reduceRight[B >: A](op: (A, B) => B): B = self.reduceRight(op)
- override def toIterable: Iterable[A] = self.toIterable
- override def zip[A1 >: A, B, That](that: Iterable[B])(implicit bf: CanBuildFrom[This, (A1, B), That]): That = self.zip[A1, B, That](that)(bf)
- override def zipAll[B, A1 >: A, That](that: Iterable[B], thisElem: A1, thatElem: B)(implicit bf: CanBuildFrom[This, (A1, B), That]): That = self.zipAll(that, thisElem, thatElem)(bf)
- override def zipWithIndex[A1 >: A, That](implicit bf: CanBuildFrom[This, (A1, Int), That]): That = self.zipWithIndex(bf)
- override def head: A = self.head
- override def takeRight(n: Int): This = self.takeRight(n)
- override def dropRight(n: Int): This = self.dropRight(n)
+ override def grouped(size: Int): Iterator[Repr] = self.grouped(size)
+ override def sliding[B >: A](size: Int): Iterator[Repr] = self.sliding(size)
+ override def sliding[B >: A](size: Int, step: Int): Iterator[Repr] = self.sliding(size, step)
+ override def takeRight(n: Int): Repr = self.takeRight(n)
+ override def dropRight(n: Int): Repr = self.dropRight(n)
+ override def zip[A1 >: A, B, That](that: Iterable[B])(implicit bf: CanBuildFrom[Repr, (A1, B), That]): That = self.zip[A1, B, That](that)(bf)
+ override def zipAll[B, A1 >: A, That](that: Iterable[B], thisElem: A1, thatElem: B)(implicit bf: CanBuildFrom[Repr, (A1, B), That]): That = self.zipAll(that, thisElem, thatElem)(bf)
+ override def zipWithIndex[A1 >: A, That](implicit bf: CanBuildFrom[Repr, (A1, Int), That]): That = self.zipWithIndex(bf)
override def sameElements[B >: A](that: Iterable[B]): Boolean = self.sameElements(that)
- override def toStream: Stream[A] = self.toStream
override def view = self.view
override def view(from: Int, until: Int) = self.view(from, until)
}
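`IterableProxyLike` is a pure forwarder: every operation delegates to an underlying `self`, and this hunk re-points the forwarders at the methods `IterableLike` still declares after the reshuffle. The pattern in miniature, using the current `Iterable` API (`Counting` and its field names are illustrative):

```scala
// A proxy wraps another collection and forwards iteration to it,
// optionally observing calls on the way through.
class Counting[A](underlying: Iterable[A]) extends Iterable[A] {
  var iterations = 0
  def iterator: Iterator[A] = { iterations += 1; underlying.iterator }
}
```

Because every bulk operation on `Iterable` is ultimately defined in terms of `iterator`, forwarding that one method is enough for `sum`, `toList`, and the rest to work through the proxy.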
diff --git a/src/library/scala/collection/IterableViewLike.scala b/src/library/scala/collection/IterableViewLike.scala
index 27323294c4..09f084d92c 100644
--- a/src/library/scala/collection/IterableViewLike.scala
+++ b/src/library/scala/collection/IterableViewLike.scala
@@ -29,6 +29,10 @@ extends Iterable[A] with IterableLike[A, This] with TraversableView[A, Coll] wit
trait Transformed[+B] extends IterableView[B, Coll] with super.Transformed[B]
+ trait Forced[B] extends Transformed[B] with super.Forced[B] {
+ override def iterator = forced.iterator
+ }
+
trait Sliced extends Transformed[A] with super.Sliced {
override def iterator = self.iterator slice (from, until)
}
@@ -96,6 +100,7 @@ extends Iterable[A] with IterableLike[A, This] with TraversableView[A, Coll] wit
/** Boilerplate method, to override in each subclass.
* This method could be eliminated if Scala had virtual classes.
*/
+ protected override def newForced[B](xs: => Seq[B]): Transformed[B] = new Forced[B] { val forced = xs }
protected override def newAppended[B >: A](that: Traversable[B]): Transformed[B] = new Appended[B] { val rest = that }
protected override def newMapped[B](f: A => B): Transformed[B] = new Mapped[B] { val mapping = f }
protected override def newFlatMapped[B](f: A => Traversable[B]): Transformed[B] = new FlatMapped[B] { val mapping = f }
@@ -104,5 +109,11 @@ extends Iterable[A] with IterableLike[A, This] with TraversableView[A, Coll] wit
protected override def newDroppedWhile(p: A => Boolean): Transformed[A] = new DroppedWhile { val pred = p }
protected override def newTakenWhile(p: A => Boolean): Transformed[A] = new TakenWhile { val pred = p }
+ override def grouped(size: Int): Iterator[This] =
+ self.iterator.grouped(size).map(xs => newForced(xs).asInstanceOf[This])
+
+ override def sliding[B >: A](size: Int, step: Int): Iterator[This] =
+ self.iterator.sliding(size, step).map(xs => newForced(xs).asInstanceOf[This])
+
override def stringPrefix = "IterableView"
}
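The new `grouped`/`sliding` overrides above delegate to the equivalent `Iterator` methods and wrap each chunk back into a view via `newForced`. As a quick reference, what the underlying iterator methods yield (`Windows` is an illustrative name):

```scala
object Windows {
  // grouped: non-overlapping chunks; the final chunk may be shorter.
  def groups: List[Seq[Int]] = (1 to 5).iterator.grouped(2).toList

  // sliding(size, step): overlapping windows advanced by `step`.
  def windows: List[Seq[Int]] = (1 to 5).iterator.sliding(3, 2).toList
}
```

So `groups` is `List(Seq(1, 2), Seq(3, 4), Seq(5))` and `windows` is `List(Seq(1, 2, 3), Seq(3, 4, 5))`.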
diff --git a/src/library/scala/collection/Iterator.scala b/src/library/scala/collection/Iterator.scala
index de0ec5275f..701b24f300 100644
--- a/src/library/scala/collection/Iterator.scala
+++ b/src/library/scala/collection/Iterator.scala
@@ -13,7 +13,7 @@ package scala.collection
import mutable.{Buffer, ArrayBuffer, ListBuffer, StringBuilder}
import immutable.{List, Stream}
-import annotation.{ tailrec }
+import annotation.{ tailrec, migration }
/** The `Iterator` object provides various functions for
* creating specialized iterators.
@@ -135,7 +135,7 @@ object Iterator {
}
/** Creates an infinite-length iterator returning the results of evaluating
- * an expression. The epxression is recomputed for every element.
+ * an expression. The expression is recomputed for every element.
*
* @param elem the element computation.
* @return the iterator containing an infinite number of results of evaluating `elem`.
@@ -145,6 +145,27 @@ object Iterator {
def next = elem
}
+ /** With the advent of TraversableOnce, it can be useful to have a builder
+ * for Iterators so they can be treated uniformly along with the collections.
+ * See scala.util.Random.shuffle for an example.
+ */
+ class IteratorCanBuildFrom[A] extends generic.CanBuildFrom[Iterator[A], A, Iterator[A]] {
+ def newIterator = new ArrayBuffer[A] mapResult (_.iterator)
+
+ /** Creates a new builder on request of a collection.
+ * @param from the collection requesting the builder to be created.
+ * @return a new builder that produces an `Iterator[A]`, ignoring `from`.
+ */
+ def apply(from: Iterator[A]) = newIterator
+
+ /** Creates a new builder from scratch.
+ * @return a new builder that produces an `Iterator[A]`.
+ */
+ def apply() = newIterator
+ }
+
+ implicit def iteratorCanBuildFrom[T]: IteratorCanBuildFrom[T] = new IteratorCanBuildFrom[T]
+
/** A wrapper class for the `flatten` method that is added to
* class `Iterator` with implicit conversion
* @see iteratorIteratorWrapper.
@@ -233,7 +254,7 @@ object Iterator {
def next(): Int = { val j = i; i = step(i); j }
}
- /** Create an iterator that is the concantenation of all iterators
+ /** Create an iterator that is the concatenation of all iterators
* returned by a given iterator of iterators.
* @param its The iterator which returns on each call to next
* a new iterator whose elements are to be concatenated to the result.
@@ -265,7 +286,8 @@ import Iterator.empty
* @define mayNotTerminateInf
* Note: may not terminate for infinite iterators.
*/
-trait Iterator[+A] { self =>
+trait Iterator[+A] extends TraversableOnce[A] {
+ self =>
/** Tests whether this iterator can provide another element.
* @return `true` if a subsequent call to `next` will yield an element,
@@ -279,6 +301,22 @@ trait Iterator[+A] { self =>
*/
def next(): A
+ /** Tests whether this iterator is empty.
+ * @return `true` if `hasNext` is `false`, `false` otherwise.
+ */
+ def isEmpty: Boolean = !hasNext
+
+ /** Tests whether this Iterator can be repeatedly traversed.
+ * @return `false`
+ */
+ def isTraversableAgain = false
+
+ /** Tests whether this Iterator has a known size.
+ *
+ * @return `true` for empty Iterators, `false` otherwise.
+ */
+ def hasDefiniteSize = isEmpty
+
/** Selects first ''n'' values of this iterator.
* @param n the number of values to take
* @return an iterator producing only the first `n` values of this iterator, or else the
@@ -319,8 +357,8 @@ trait Iterator[+A] { self =>
/** Creates a new iterator that maps all produced values of this iterator
* to new values using a transformation function.
* @param f the transformation function
- * @return a new iterator which transformes every value produced by this
- * iterator by applying the functon `f` to it.
+ * @return a new iterator which transforms every value produced by this
+ * iterator by applying the function `f` to it.
*/
def map[B](f: A => B): Iterator[B] = new Iterator[B] {
def hasNext = self.hasNext
@@ -328,7 +366,7 @@ trait Iterator[+A] { self =>
}
/** Concatenates this iterator with another.
- * @that the other iterator
+ * @param that the other iterator
* @return a new iterator that first yields the values produced by this
* iterator followed by the values produced by iterator `that`.
* @usecase def ++(that: => Iterator[A]): Iterator[A]
@@ -411,7 +449,11 @@ trait Iterator[+A] { self =>
* @return a new iterator which yields the image `pf(x)` of each value `x`
* produced by this iterator for which `pf` is defined.
*/
- def partialMap[B](pf: PartialFunction[A, B]): Iterator[B] = {
+ @migration(2, 8,
+ "This collect implementation bears no relationship to the one before 2.8.\n"+
+ "The previous behavior can be reproduced with toSeq."
+ )
+ def collect[B](pf: PartialFunction[A, B]): Iterator[B] = {
val self = buffered
new Iterator[B] {
private def skip() = while (self.hasNext && !pf.isDefinedAt(self.head)) self.next()
@@ -667,12 +709,12 @@ trait Iterator[+A] { self =>
if (found) i else -1
}
- /** Returns the index of the first occurence of the specified
+ /** Returns the index of the first occurrence of the specified
* object in this iterable object.
* $mayNotTerminateInf
*
* @param elem element to search for.
- * @return the index of the first occurence of `elem` in the values produced by this iterator,
+ * @return the index of the first occurrence of `elem` in the values produced by this iterator,
* or -1 if such an element does not exist until the end of the iterator is reached.
*/
def indexOf[B >: A](elem: B): Int = {
@@ -688,131 +730,6 @@ trait Iterator[+A] { self =>
if (found) i else -1
}
- /** Applies a binary operator to a start value and all values produced by this iterator, going left to right.
- * $willNotTerminateInf
- * @param z the start value.
- * @param op the binary operator.
- * @tparam B the result type of the binary operator.
- * @return the result of inserting `op` between consecutive values produced by this iterator
- * going left to right with the start value `z` on the left:
- * {{{
- * op(...op(z, x,,1,,), x,,2,,, ..., x,,n,,)
- * }}}
- * where `x,,1,,, ..., x,,n,,` are the values produced by this iterator.
- */
- def foldLeft[B](z: B)(op: (B, A) => B): B = {
- var acc = z
- while (hasNext) acc = op(acc, next())
- acc
- }
-
- /** Applies a binary operator to all values produced by this iterator and a start value, going right to left.
- * $willNotTerminateInf
- * @param z the start value.
- * @param op the binary operator.
- * @tparam B the result type of the binary operator.
- * @return the result of inserting `op` between consecutive values produced by this iterator
- * going right to left with the start value `z` on the right:
- * {{{
- * op(x,,1,,, op(x,,2,,, ... op(x,,n,,, z)...))
- * }}}
- * where `x,,1,,, ..., x,,n,,` are the values produced by this iterator.
- */
- def foldRight[B](z: B)(op: (A, B) => B): B =
- if (hasNext) op(next(), foldRight(z)(op)) else z
-
- /** Applies a binary operator to a start value and all values produced by this iterator, going left to right.
- *
- * Note: `/:` is alternate syntax for `foldLeft`; `z /: it` is the same as `it foldLeft z`.
- * $willNotTerminateInf
- *
- * @param z the start value.
- * @param op the binary operator.
- * @tparam B the result type of the binary operator.
- * @return the result of inserting `op` between consecutive values produced by this iterator
- * going left to right with the start value `z` on the left:
- * {{{
- * op(...op(z, x,,1,,), x,,2,,, ..., x,,n,,)
- * }}}
- * where `x,,1,,, ..., x,,n,,` are the values produced by this iterator.
- */
- def /:[B](z: B)(op: (B, A) => B): B = foldLeft(z)(op)
-
- /** Applies a binary operator to all values produced by this iterator and a start value, going right to left.
- * Note: `:\` is alternate syntax for `foldRight`; `it :\ z` is the same as `it foldRight z`.
- * $willNotTerminateInf
- * @param z the start value.
- * @param op the binary operator.
- * @tparam B the result type of the binary operator.
- * @return the result of inserting `op` between consecutive values produced by this iterator
- * going right to left with the start value `z` on the right:
- * {{{
- * op(x,,1,,, op(x,,2,,, ... op(x,,n,,, z)...))
- * }}}
- * where `x,,1,,, ..., x,,n,,` are the values produced by this iterator.
- */
- def :\[B](z: B)(op: (A, B) => B): B = foldRight(z)(op)
-
- /** Applies a binary operator to all values produced by this iterator, going left to right.
- * $willNotTerminateInf
- *
- * @param op the binary operator.
- * @tparam B the result type of the binary operator.
- * @return the result of inserting `op` between consecutive values produced by this iterator
- * going left to right:
- * {{{
- * op(...(op(x,,1,,, x,,2,,), ... ) , x,,n,,)
- * }}}
- * where `x,,1,,, ..., x,,n,,` are the values produced by this iterator.
- * @throws `UnsupportedOperationException` if this iterator is empty.
- */
- def reduceLeft[B >: A](op: (B, A) => B): B = {
- if (hasNext) foldLeft[B](next())(op)
- else throw new UnsupportedOperationException("empty.reduceLeft")
- }
-
- /** Applies a binary operator to all values produced by this iterator, going right to left.
- * $willNotTerminateInf
- *
- * @param op the binary operator.
- * @tparam B the result type of the binary operator.
- * @return the result of inserting `op` between consecutive values produced by this iterator
- * going right to left:
- * {{{
- * op(x,,1,,, op(x,,2,,, ..., op(x,,n-1,,, x,,n,,)...))
- * }}}
- * where `x,,1,,, ..., x,,n,,` are the values produced by this iterator.
- * @throws `UnsupportedOperationException` if this iterator is empty.
- */
- def reduceRight[B >: A](op: (A, B) => B): B = {
- if (hasNext) foldRight[B](next())(op)
- else throw new UnsupportedOperationException("empty.reduceRight")
- }
-
- /** Optionally applies a binary operator to all values produced by this iterator, going left to right.
- * $willNotTerminateInf
- *
- * @param op the binary operator.
- * @tparam B the result type of the binary operator.
- * @return an option value containing the result of `reduceLeft(op)` is this iterator is nonempty,
- * `None` otherwise.
- */
- def reduceLeftOption[B >: A](op: (B, A) => B): Option[B] = {
- if (!hasNext) None else Some(reduceLeft(op))
- }
-
- /** Optionally applies a binary operator to all values produced by this iterator, going right to left.
- * $willNotTerminateInf
- *
- * @param op the binary operator.
- * @tparam B the result type of the binary operator.
- * @return an option value containing the result of `reduceRight(op)` is this iterator is nonempty,
- * `None` otherwise.
- */
- def reduceRightOption[B >: A](op: (A, B) => B): Option[B] = {
- if (!hasNext) None else Some(reduceRight(op))
- }
-
/** Creates a buffered iterator from this iterator.
* @see BufferedIterator
* @return a buffered iterator producing the same values as this iterator.
@@ -937,6 +854,8 @@ trait Iterator[+A] { self =>
if (!filled)
fill()
+ if (!filled)
+ throw new NoSuchElementException("next on empty iterator")
filled = false
buffer.toList
}
@@ -985,16 +904,11 @@ trait Iterator[+A] { self =>
*
* Note: The iterator is at its end after this method returns.
*/
- def length: Int = {
- var i = 0
- while (hasNext) {
- next(); i += 1
- }
- i
- }
+ def length: Int = this.size
/** Creates two new iterators that both iterate over the same elements
- * as this iterator (in the same order).
+ * as this iterator (in the same order). The duplicate iterators are
+ * considered equal if they are positioned at the same element.
*
* @return a pair of iterators
*/
@@ -1013,6 +927,14 @@ trait Iterator[+A] { self =>
e
} else gap.dequeue
}
+ // to verify partnerhood we use reference equality on gap because
+ // type testing does not discriminate based on origin.
+ private def compareGap(queue: scala.collection.mutable.Queue[A]) = gap eq queue
+ override def hashCode = gap.hashCode
+ override def equals(other: Any) = other match {
+ case x: Partner => x.compareGap(gap) && gap.isEmpty
+ case _ => super.equals(other)
+ }
}
(new Partner, new Partner)
}
@@ -1063,75 +985,7 @@ trait Iterator[+A] { self =>
}
}
- /** Copies values produced by this iterator to an array.
- * Fills the given array `xs` with values produced by this iterator, after skipping `start` values.
- * Copying will stop once either the end of the current iterator is reached,
- * or the end of the array is reached.
- *
- * $willNotTerminateInf
- *
- * @param xs the array to fill.
- * @param start the starting index.
- * @tparam B the type of the elements of the array.
- *
- * @usecase def copyToArray(xs: Array[A], start: Int, len: Int): Unit
- */
- def copyToArray[B >: A](xs: Array[B], start: Int): Unit =
- copyToArray(xs, start, xs.length - start)
-
- /** Copies values produced by this iterator to an array.
- * Fills the given array `xs` with values produced by this iterator.
- * Copying will stop once either the end of the current iterator is reached,
- * or the end of the array is reached.
- *
- * $willNotTerminateInf
- *
- * @param xs the array to fill.
- * @tparam B the type of the elements of the array.
- *
- * @usecase def copyToArray(xs: Array[A], start: Int, len: Int): Unit
- */
- def copyToArray[B >: A](xs: Array[B]): Unit = copyToArray(xs, 0, xs.length)
-
- /** Copies all values produced by this iterator to a buffer.
- * $willNotTerminateInf
- * @param dest The buffer to which elements are copied
- */
- def copyToBuffer[B >: A](dest: Buffer[B]) {
- while (hasNext) dest += next()
- }
-
- /** Traverses this iterator and returns all produced values in a list.
- * $willNotTerminateInf
- *
- * @return a list which contains all values produced by this iterator.
- */
- def toList: List[A] = {
- val res = new ListBuffer[A]
- while (hasNext) res += next
- res.toList
- }
-
- /** Lazily wraps a Stream around this iterator so its values are memoized.
- *
- * @return a Stream which can repeatedly produce all the values
- * produced by this iterator.
- */
- def toStream: Stream[A] =
- if (hasNext) Stream.cons(next, toStream) else Stream.empty
-
- /** Traverses this iterator and returns all produced values in a sequence.
- * $willNotTerminateInf
- *
- * @return a list which contains all values produced by this iterator.
- */
- def toSeq: Seq[A] = {
- val buffer = new ArrayBuffer[A]
- this copyToBuffer buffer
- buffer
- }
-
- /** Tests if another iterator produces the same valeus as this one.
+ /** Tests if another iterator produces the same values as this one.
* $willNotTerminateInf
* @param that the other iterator
* @return `true`, if both iterators produce the same elements in the same order, `false` otherwise.
@@ -1144,76 +998,8 @@ trait Iterator[+A] { self =>
!hasNext && !that.hasNext
}
- /** Displays all values produced by this iterator in a string using start, end, and separator strings.
- *
- * @param start the starting string.
- * @param sep the separator string.
- * @param end the ending string.
- * @return a string representation of this iterator. The resulting string
- * begins with the string `start` and ends with the string
- * `end`. Inside, the string representations (w.r.t. the method `toString`)
- * of all values produced by this iterator are separated by the string `sep`.
- */
- def mkString(start: String, sep: String, end: String): String = {
- val buf = new StringBuilder
- addString(buf, start, sep, end).toString
- }
-
- /** Displays all values produced by this iterator in a string using a separator string.
- *
- * @param sep the separator string.
- * @return a string representation of this iterator. In the resulting string
- * the string representations (w.r.t. the method `toString`)
- * of all values produced by this iterator are separated by the string `sep`.
- */
- def mkString(sep: String): String = mkString("", sep, "")
-
- /** Displays all values produced by this iterator in a string.
- * @return a string representation of this iterator. In the resulting string
- * the string representations (w.r.t. the method `toString`)
- * of all values produced by this iterator follow each other without any separator string.
- */
- def mkString: String = mkString("")
-
- /** Appends all values produced by this iterator to a string builder using start, end, and separator strings.
- * The written text begins with the string `start` and ends with the string
- * `end`. Inside, the string representations (w.r.t. the method `toString`)
- * of all values produced by this iterator are separated by the string `sep`.
- *
- * @param b the string builder to which elements are appended.
- * @param start the starting string.
- * @param sep the separator string.
- * @param end the ending string.
- * @return the string builder `b` to which elements were appended.
- */
- def addString(buf: StringBuilder, start: String, sep: String, end: String): StringBuilder = {
- buf.append(start)
- val elems = this
- if (elems.hasNext) buf.append(elems.next)
- while (elems.hasNext) {
- buf.append(sep); buf.append(elems.next)
- }
- buf.append(end)
- }
-
- /** Appends all values produced by this iterator to a string builder using a separator string.
- * The written text consists of the string representations (w.r.t. the method `toString`)
- * of all values produced by this iterator, separated by the string `sep`.
- *
- * @param b the string builder to which elements are appended.
- * @param sep the separator string.
- * @return the string builder `b` to which elements were appended.
- */
- def addString(buf: StringBuilder, sep: String): StringBuilder = addString(buf, "", sep, "")
-
- /** Appends all values produced by this iterator to a string builder.
- * The written text consists of the string representations (w.r.t. the method `toString`)
- * of all values produced by this iterator without any separator string.
- *
- * @param b the string builder to which elements are appended.
- * @return the string builder `b` to which elements were appended.
- */
- def addString(buf: StringBuilder): StringBuilder = addString(buf, "", "", "")
+ def toTraversable: Traversable[A] = toStream
+ def toIterator: Iterator[A] = self
/** Converts this iterator to a string.
* @return `"empty iterator"` or `"non-empty iterator"`, depending on whether or not the iterator is empty.
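
The surviving `sameElements` (alongside the new `toIterator`, which simply returns the iterator itself) can be exercised with a small sketch, not taken from the patch:

```scala
object SameElementsDemo extends App {
  val it = Iterator(1, 2, 3)
  // sameElements compares element by element, consuming both iterators.
  assert(it.sameElements(Iterator(1, 2, 3)))
  assert(!it.hasNext) // the receiver is exhausted afterwards
}
```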
@@ -1230,13 +1016,6 @@ trait Iterator[+A] { self =>
@deprecated("use `indexWhere` instead")
def findIndexOf(p: A => Boolean): Int = indexWhere(p)
- /** Collect elements into a seq.
- *
- * @return a sequence which enumerates all elements of this iterator.
- */
- @deprecated("use toSeq instead")
- def collect: Seq[A] = toSeq
-
/** Returns a counted iterator from this iterator.
*/
@deprecated("use zipWithIndex in Iterator")
@@ -1254,7 +1033,7 @@ trait Iterator[+A] { self =>
* @param xs the array to fill.
* @param start the starting index.
* @param sz the maximum number of elements to be read.
- * @pre the array must be large enough to hold `sz` elements.
+ * @note the array must be large enough to hold `sz` elements.
*/
@deprecated("use copyToArray instead")
def readInto[B >: A](xs: Array[B], start: Int, sz: Int) {
diff --git a/src/library/scala/collection/JavaConversions.scala b/src/library/scala/collection/JavaConversions.scala
index 7af138067b..00f2d745af 100644
--- a/src/library/scala/collection/JavaConversions.scala
+++ b/src/library/scala/collection/JavaConversions.scala
@@ -40,7 +40,7 @@ package scala.collection
* <p>
* Note that no conversion is provided from <code>scala.collection.immutable.List</code>
* to <code>java.util.List</code>. Instead it is convertible to an immutable
- * <code>java.util.Collection</code> which provides size and interation
+ * <code>java.util.Collection</code> which provides size and iteration
* capabilities, but not access by index as would be provided by
* <code>java.util.List</code>.<br/>
* This is intentional: in combination the implementation of
@@ -497,9 +497,10 @@ object JavaConversions {
case class MutableMapWrapper[A, B](underlying : mutable.Map[A, B])(m : ClassManifest[A])
extends MutableMapWrapperLike[A, B](underlying)(m)
- abstract class JMapWrapperLike[A, B, +Repr <: mutable.MapLike[A, B, Repr] with mutable.Map[A, B]]
- (underlying: ju.Map[A, B])
+ trait JMapWrapperLike[A, B, +Repr <: mutable.MapLike[A, B, Repr] with mutable.Map[A, B]]
extends mutable.Map[A, B] with mutable.MapLike[A, B, Repr] {
+ def underlying: ju.Map[A, B]
+
override def size = underlying.size
def get(k : A) = {
@@ -538,8 +539,8 @@ object JavaConversions {
override def empty: Repr = null.asInstanceOf[Repr]
}
- case class JMapWrapper[A, B](underlying : ju.Map[A, B])
- extends JMapWrapperLike[A, B, JMapWrapper[A, B]](underlying) {
+ case class JMapWrapper[A, B](val underlying : ju.Map[A, B])
+ extends JMapWrapperLike[A, B, JMapWrapper[A, B]] {
override def empty = JMapWrapper(new ju.HashMap[A, B])
}
@@ -584,8 +585,8 @@ object JavaConversions {
}
- case class JConcurrentMapWrapper[A, B](underlying: juc.ConcurrentMap[A, B])
- extends JMapWrapperLike[A, B, JConcurrentMapWrapper[A, B]](underlying) with mutable.ConcurrentMap[A, B] {
+ case class JConcurrentMapWrapper[A, B](val underlying: juc.ConcurrentMap[A, B])
+ extends JMapWrapperLike[A, B, JConcurrentMapWrapper[A, B]] with mutable.ConcurrentMap[A, B] {
override def get(k: A) = {
val v = underlying.get(k)
if (v != null) Some(v)
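
The wrapper refactoring above preserves the key property of these conversions: the wrapper shares state with the underlying Java map rather than copying it. A sketch under the modern spelling (`scala.jdk.CollectionConverters`; the 2.8-era import was `scala.collection.JavaConversions`):

```scala
import scala.jdk.CollectionConverters._ // 2.8 spelled this scala.collection.JavaConversions

object WrapperDemo extends App {
  val jm = new java.util.HashMap[String, Int]()
  jm.put("a", 1)
  val sm = jm.asScala // a mutable.Map wrapper over jm, not a copy
  sm("b") = 2         // writes go through to the Java map
  assert(jm.get("b") == 2)
  assert(sm("a") == 1)
}
```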
diff --git a/src/library/scala/collection/LinearSeq.scala b/src/library/scala/collection/LinearSeq.scala
index 5862741530..1afb2fdb7f 100644
--- a/src/library/scala/collection/LinearSeq.scala
+++ b/src/library/scala/collection/LinearSeq.scala
@@ -14,18 +14,10 @@ package scala.collection
import generic._
import mutable.Builder
-/** <p>
- * Class <code>Linear[A]</code> represents linear sequences of elements.
- * For such sequences <code>isEmpty</code>, <code>head</code> and
- * <code>tail</code> are guaranteed to be efficient constant time (or near so)
- * operations.<br/>
- * It does not add any methods to <code>Seq</code> but overrides several
- * methods with optimized implementations.
- * </p>
+/** A base trait for linear sequences.
+ * $linearSeqInfo
*
* @author Martin Odersky
- * @author Matthias Zenger
- * @version 1.0, 16/07/2003
* @since 2.8
*/
trait LinearSeq[+A] extends Seq[A]
diff --git a/src/library/scala/collection/LinearSeqLike.scala b/src/library/scala/collection/LinearSeqLike.scala
index 9bed88967c..1c99d4a3d9 100644
--- a/src/library/scala/collection/LinearSeqLike.scala
+++ b/src/library/scala/collection/LinearSeqLike.scala
@@ -19,12 +19,16 @@ import scala.util.control.Breaks._
/** A template trait for linear sequences of type `LinearSeq[A]`.
*
* $linearSeqInfo
+ *
+ * This trait just implements `iterator`
+ * in terms of `isEmpty`, `head`, and `tail`.
+ * However, see `LinearSeqOptimized` for an implementation trait that overrides operations
+ * to make them run faster under the assumption of fast linear access with `head` and `tail`.
+ *
* @author Martin Odersky
- * @author Matthias Zenger
- * @version 1.0, 16/07/2003
+ * @version 2.8
* @since 2.8
*
- * @define Coll LinearSeq
* @define linearSeqInfo
* Linear sequences are defined in terms of three abstract methods, which are assumed
* to have efficient implementations. These are:
@@ -35,9 +39,8 @@ import scala.util.control.Breaks._
* }}}
* Here, `A` is the type of the sequence elements and `Repr` is the type of the sequence itself.
*
- * Linear sequences do not define any new methods wrt `Seq`. However, abstract `Seq` methods
- * are defined in terms of `isEmpty`, `head`, and `tail`, and several other methods are overridden
- * with optimized implementations.
+ * Linear sequences do not add any new methods to `Seq`, but promise efficient implementations
+ * of linear access patterns.
*
* @tparam A the element type of the $coll
* @tparam Repr the type of the actual $coll containing the elements.
@@ -47,38 +50,6 @@ trait LinearSeqLike[+A, +Repr <: LinearSeqLike[A, Repr]] extends SeqLike[A, Repr
override protected[this] def thisCollection: LinearSeq[A] = this.asInstanceOf[LinearSeq[A]]
override protected[this] def toCollection(repr: Repr): LinearSeq[A] = repr.asInstanceOf[LinearSeq[A]]
- def isEmpty: Boolean
-
- def head: A
-
- def tail: Repr
-
- /** The length of the $coll.
- *
- * $willNotTerminateInf
- *
- * Note: the execution of `length` may take time proportial to the length of the sequence.
- */
- def length: Int = {
- var these = self
- var len = 0
- while (!these.isEmpty) {
- len += 1
- these = these.tail
- }
- len
- }
-
- /** Selects an element by its index in the $coll.
- * Note: the execution of `apply` may take time proportial to the index value.
- * @throws `IndexOutOfBoundsEsxception` if `idx` does not satisfy `0 <= idx < length`.
- */
- def apply(n: Int): A = {
- val rest = drop(n)
- if (n < 0 || rest.isEmpty) throw new IndexOutOfBoundsException
- rest.head
- }
-
override /*IterableLike*/
def iterator: Iterator[A] = new Iterator[A] {
var these = self
@@ -89,239 +60,4 @@ trait LinearSeqLike[+A, +Repr <: LinearSeqLike[A, Repr]] extends SeqLike[A, Repr
} else Iterator.empty.next
override def toList: List[A] = these.toList
}
-
- override /*IterableLike*/
- def foreach[B](f: A => B) {
- var these = this
- while (!these.isEmpty) {
- f(these.head)
- these = these.tail
- }
- }
-
-
- override /*IterableLike*/
- def forall(p: A => Boolean): Boolean = {
- var these = this
- while (!these.isEmpty) {
- if (!p(these.head)) return false
- these = these.tail
- }
- true
- }
-
- override /*IterableLike*/
- def exists(p: A => Boolean): Boolean = {
- var these = this
- while (!these.isEmpty) {
- if (p(these.head)) return true
- these = these.tail
- }
- false
- }
-
- override /*TraversableLike*/
- def count(p: A => Boolean): Int = {
- var these = this
- var cnt = 0
- while (!these.isEmpty) {
- if (p(these.head)) cnt += 1
- these = these.tail
- }
- cnt
- }
-
- override /*IterableLike*/
- def find(p: A => Boolean): Option[A] = {
- var these = this
- while (!these.isEmpty) {
- if (p(these.head)) return Some(these.head)
- these = these.tail
- }
- None
- }
-/*
- override def mapFind[B](f: A => Option[B]): Option[B] = {
- var res: Option[B] = None
- var these = this
- while (res.isEmpty && !these.isEmpty) {
- res = f(these.head)
- these = these.tail
- }
- res
- }
-*/
- override /*TraversableLike*/
- def foldLeft[B](z: B)(f: (B, A) => B): B = {
- var acc = z
- var these = this
- while (!these.isEmpty) {
- acc = f(acc, these.head)
- these = these.tail
- }
- acc
- }
-
- override /*IterableLike*/
- def foldRight[B](z: B)(f: (A, B) => B): B =
- if (this.isEmpty) z
- else f(head, tail.foldRight(z)(f))
-
- override /*TraversableLike*/
- def reduceLeft[B >: A](f: (B, A) => B): B =
- if (isEmpty) throw new UnsupportedOperationException("empty.reduceLeft")
- else tail.foldLeft[B](head)(f)
-
- override /*IterableLike*/
- def reduceRight[B >: A](op: (A, B) => B): B =
- if (isEmpty) throw new UnsupportedOperationException("Nil.reduceRight")
- else if (tail.isEmpty) head
- else op(head, tail.reduceRight(op))
-
- override /*TraversableLike*/
- def last: A = {
- if (isEmpty) throw new NoSuchElementException
- var these = this
- var nx = these.tail
- while (!nx.isEmpty) {
- these = nx
- nx = nx.tail
- }
- these.head
- }
-
- override /*IterableLike*/
- def take(n: Int): Repr = {
- val b = newBuilder
- var i = 0
- var these = repr
- while (!these.isEmpty && i < n) {
- i += 1
- b += these.head
- these = these.tail
- }
- b.result
- }
-
- override /*TraversableLike*/
- def drop(n: Int): Repr = {
- var these: Repr = repr
- var count = n
- while (!these.isEmpty && count > 0) {
- these = these.tail
- count -= 1
- }
- these
- }
-
- override /*IterableLike*/
- def dropRight(n: Int): Repr = {
- val b = newBuilder
- var these = this
- var lead = this drop n
- while (!lead.isEmpty) {
- b += these.head
- these = these.tail
- lead = lead.tail
- }
- b.result
- }
-
- override /*IterableLike*/
- def slice(from: Int, until: Int): Repr = {
- val b = newBuilder
- var i = from
- var these = this drop from
- while (i < until && !these.isEmpty) {
- b += these.head
- these = these.tail
- i += 1
- }
- b.result
- }
-
- override /*IterableLike*/
- def takeWhile(p: A => Boolean): Repr = {
- val b = newBuilder
- var these = this
- while (!these.isEmpty && p(these.head)) {
- b += these.head
- these = these.tail
- }
- b.result
- }
-
- override /*TraversableLike*/
- def span(p: A => Boolean): (Repr, Repr) = {
- var these: Repr = repr
- val b = newBuilder
- while (!these.isEmpty && p(these.head)) {
- b += these.head
- these = these.tail
- }
- (b.result, these)
- }
-
- override /*IterableLike*/
- def sameElements[B >: A](that: Iterable[B]): Boolean = that match {
- case that1: LinearSeq[_] =>
- var these = this
- var those = that1
- while (!these.isEmpty && !those.isEmpty && these.head == those.head) {
- these = these.tail
- those = those.tail
- }
- these.isEmpty && those.isEmpty
- case _ =>
- super.sameElements(that)
- }
-
- override /*SeqLike*/
- def lengthCompare(len: Int): Int = {
- var i = 0
- var these = self
- while (!these.isEmpty && i <= len) {
- i += 1
- these = these.tail
- }
- i - len
- }
-
- override /*SeqLike*/
- def isDefinedAt(x: Int): Boolean = x >= 0 && lengthCompare(x) > 0
-
- override /*SeqLike*/
- def segmentLength(p: A => Boolean, from: Int): Int = {
- var i = 0
- var these = this drop from
- while (!these.isEmpty && p(these.head)) {
- i += 1
- these = these.tail
- }
- i
- }
-
- override /*SeqLike*/
- def indexWhere(p: A => Boolean, from: Int): Int = {
- var i = from
- var these = this drop from
- while (!these.isEmpty && !p(these.head)) {
- i += 1
- these = these.tail
- }
- if (these.isEmpty) -1 else i
- }
-
- override /*SeqLike*/
- def lastIndexWhere(p: A => Boolean, end: Int): Int = {
- var i = 0
- var these = this
- var last = -1
- while (!these.isEmpty && i <= end) {
- if (p(these.head)) last = i
- these = these.tail
- i += 1
- }
- last
- }
}
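
After this change, `LinearSeqLike` keeps only `iterator`, built purely from `isEmpty`, `head`, and `tail`. A standalone sketch of that construction (illustrative, using `List` as the linear sequence):

```scala
object LinearIteratorDemo extends App {
  // Mirrors the trait's iterator: advance by re-binding the cursor to tail.
  def linearIterator[A](start: List[A]): Iterator[A] = new Iterator[A] {
    private var these = start
    def hasNext: Boolean = !these.isEmpty
    def next(): A =
      if (hasNext) { val result = these.head; these = these.tail; result }
      else Iterator.empty.next()
  }
  assert(linearIterator(List(1, 2, 3)).toList == List(1, 2, 3))
}
```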
diff --git a/src/library/scala/collection/LinearSeqOptimized.scala b/src/library/scala/collection/LinearSeqOptimized.scala
new file mode 100755
index 0000000000..7d3c58ad85
--- /dev/null
+++ b/src/library/scala/collection/LinearSeqOptimized.scala
@@ -0,0 +1,301 @@
+/* __ *\
+** ________ ___ / / ___ Scala API **
+** / __/ __// _ | / / / _ | (c) 2003-2010, LAMP/EPFL **
+** __\ \/ /__/ __ |/ /__/ __ | http://scala-lang.org/ **
+** /____/\___/_/ |_/____/_/ | | **
+** |/ **
+\* */
+
+// $Id: LinearSeqOptimized.scala 20608 2010-01-20 00:28:09Z extempore $
+
+
+package scala.collection
+import generic._
+
+import mutable.ListBuffer
+import immutable.List
+import scala.util.control.Breaks._
+
+/** A template trait for linear sequences of type `LinearSeq[A]` which optimizes
+ * the implementation of several methods under the assumption of fast linear access.
+ *
+ * $linearSeqInfo
+ * @author Martin Odersky
+ * @version 2.8
+ * @since 2.8
+ *
+ * @tparam A the element type of the $coll
+ * @tparam Repr the type of the actual $coll containing the elements.
+ */
+trait LinearSeqOptimized[+A, +Repr <: LinearSeqOptimized[A, Repr]] extends LinearSeqLike[A, Repr] { self: Repr =>
+
+ def isEmpty: Boolean
+
+ def head: A
+
+ def tail: Repr
+
+ /** The length of the $coll.
+ *
+ * $willNotTerminateInf
+ *
+ * Note: the execution of `length` may take time proportional to the length of the sequence.
+ */
+ def length: Int = {
+ var these = self
+ var len = 0
+ while (!these.isEmpty) {
+ len += 1
+ these = these.tail
+ }
+ len
+ }
+
+ /** Selects an element by its index in the $coll.
+ * Note: the execution of `apply` may take time proportional to the index value.
+ * @throws `IndexOutOfBoundsException` if `idx` does not satisfy `0 <= idx < length`.
+ */
+ def apply(n: Int): A = {
+ val rest = drop(n)
+ if (n < 0 || rest.isEmpty) throw new IndexOutOfBoundsException
+ rest.head
+ }
+
+ override /*IterableLike*/
+ def foreach[B](f: A => B) {
+ var these = this
+ while (!these.isEmpty) {
+ f(these.head)
+ these = these.tail
+ }
+ }
+
+
+ override /*IterableLike*/
+ def forall(p: A => Boolean): Boolean = {
+ var these = this
+ while (!these.isEmpty) {
+ if (!p(these.head)) return false
+ these = these.tail
+ }
+ true
+ }
+
+ override /*IterableLike*/
+ def exists(p: A => Boolean): Boolean = {
+ var these = this
+ while (!these.isEmpty) {
+ if (p(these.head)) return true
+ these = these.tail
+ }
+ false
+ }
+
+ override /*TraversableLike*/
+ def count(p: A => Boolean): Int = {
+ var these = this
+ var cnt = 0
+ while (!these.isEmpty) {
+ if (p(these.head)) cnt += 1
+ these = these.tail
+ }
+ cnt
+ }
+
+ override /*IterableLike*/
+ def find(p: A => Boolean): Option[A] = {
+ var these = this
+ while (!these.isEmpty) {
+ if (p(these.head)) return Some(these.head)
+ these = these.tail
+ }
+ None
+ }
+/*
+ override def mapFind[B](f: A => Option[B]): Option[B] = {
+ var res: Option[B] = None
+ var these = this
+ while (res.isEmpty && !these.isEmpty) {
+ res = f(these.head)
+ these = these.tail
+ }
+ res
+ }
+*/
+ override /*TraversableLike*/
+ def foldLeft[B](z: B)(f: (B, A) => B): B = {
+ var acc = z
+ var these = this
+ while (!these.isEmpty) {
+ acc = f(acc, these.head)
+ these = these.tail
+ }
+ acc
+ }
+
+ override /*IterableLike*/
+ def foldRight[B](z: B)(f: (A, B) => B): B =
+ if (this.isEmpty) z
+ else f(head, tail.foldRight(z)(f))
+
+ override /*TraversableLike*/
+ def reduceLeft[B >: A](f: (B, A) => B): B =
+ if (isEmpty) throw new UnsupportedOperationException("empty.reduceLeft")
+ else tail.foldLeft[B](head)(f)
+
+ override /*IterableLike*/
+ def reduceRight[B >: A](op: (A, B) => B): B =
+ if (isEmpty) throw new UnsupportedOperationException("Nil.reduceRight")
+ else if (tail.isEmpty) head
+ else op(head, tail.reduceRight(op))
+
+ override /*TraversableLike*/
+ def last: A = {
+ if (isEmpty) throw new NoSuchElementException
+ var these = this
+ var nx = these.tail
+ while (!nx.isEmpty) {
+ these = nx
+ nx = nx.tail
+ }
+ these.head
+ }
+
+ override /*IterableLike*/
+ def take(n: Int): Repr = {
+ val b = newBuilder
+ var i = 0
+ var these = repr
+ while (!these.isEmpty && i < n) {
+ i += 1
+ b += these.head
+ these = these.tail
+ }
+ b.result
+ }
+
+ override /*TraversableLike*/
+ def drop(n: Int): Repr = {
+ var these: Repr = repr
+ var count = n
+ while (!these.isEmpty && count > 0) {
+ these = these.tail
+ count -= 1
+ }
+ these
+ }
+
+ override /*IterableLike*/
+ def dropRight(n: Int): Repr = {
+ val b = newBuilder
+ var these = this
+ var lead = this drop n
+ while (!lead.isEmpty) {
+ b += these.head
+ these = these.tail
+ lead = lead.tail
+ }
+ b.result
+ }
+
+ override /*IterableLike*/
+ def slice(from: Int, until: Int): Repr = {
+ val b = newBuilder
+ var i = from
+ var these = this drop from
+ while (i < until && !these.isEmpty) {
+ b += these.head
+ these = these.tail
+ i += 1
+ }
+ b.result
+ }
+
+ override /*IterableLike*/
+ def takeWhile(p: A => Boolean): Repr = {
+ val b = newBuilder
+ var these = this
+ while (!these.isEmpty && p(these.head)) {
+ b += these.head
+ these = these.tail
+ }
+ b.result
+ }
+
+ override /*TraversableLike*/
+ def span(p: A => Boolean): (Repr, Repr) = {
+ var these: Repr = repr
+ val b = newBuilder
+ while (!these.isEmpty && p(these.head)) {
+ b += these.head
+ these = these.tail
+ }
+ (b.result, these)
+ }
+
+ override /*IterableLike*/
+ def sameElements[B >: A](that: Iterable[B]): Boolean = that match {
+ case that1: LinearSeq[_] =>
+ var these = this
+ var those = that1
+ while (!these.isEmpty && !those.isEmpty && these.head == those.head) {
+ these = these.tail
+ those = those.tail
+ }
+ these.isEmpty && those.isEmpty
+ case _ =>
+ super.sameElements(that)
+ }
+
+ override /*SeqLike*/
+ def lengthCompare(len: Int): Int = {
+ var i = 0
+ var these = self
+ while (!these.isEmpty && i <= len) {
+ i += 1
+ these = these.tail
+ }
+ i - len
+ }
+
+ override /*SeqLike*/
+ def isDefinedAt(x: Int): Boolean = x >= 0 && lengthCompare(x) > 0
+
+ override /*SeqLike*/
+ def segmentLength(p: A => Boolean, from: Int): Int = {
+ var i = 0
+ var these = this drop from
+ while (!these.isEmpty && p(these.head)) {
+ i += 1
+ these = these.tail
+ }
+ i
+ }
+
+ override /*SeqLike*/
+ def indexWhere(p: A => Boolean, from: Int): Int = {
+ var i = from
+ var these = this drop from
+ while (these.nonEmpty) {
+ if (p(these.head))
+ return i
+
+ i += 1
+ these = these.tail
+ }
+ -1
+ }
+
+ override /*SeqLike*/
+ def lastIndexWhere(p: A => Boolean, end: Int): Int = {
+ var i = 0
+ var these = this
+ var last = -1
+ while (!these.isEmpty && i <= end) {
+ if (p(these.head)) last = i
+ these = these.tail
+ i += 1
+ }
+ last
+ }
+}
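
The contracts of the implementations moved into this new trait can be checked through `List`, which uses these linear-access implementations. A sketch, not part of the patch:

```scala
object OptimizedDemo extends App {
  val xs = List(1, 2, 3)
  // lengthCompare walks at most len + 1 nodes instead of computing length,
  // returning a result whose sign compares the length against len.
  assert(xs.lengthCompare(2) > 0)
  assert(xs.lengthCompare(3) == 0)
  assert(xs.lengthCompare(5) < 0)
  // The early-returning indexWhere above yields the first matching index.
  assert(xs.indexWhere(_ > 1, 0) == 1)
  assert(xs.indexWhere(_ > 9, 0) == -1)
}
```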
diff --git a/src/library/scala/collection/MapLike.scala b/src/library/scala/collection/MapLike.scala
index 55cea1a678..5e1af7a2d7 100644
--- a/src/library/scala/collection/MapLike.scala
+++ b/src/library/scala/collection/MapLike.scala
@@ -12,6 +12,7 @@ package scala.collection
import generic._
import mutable.{Builder, StringBuilder, MapBuilder}
+import annotation.migration
import PartialFunction._
/** A template trait for maps of type `Map[A, B]` which associate keys of type `A`
@@ -71,7 +72,7 @@ self =>
/** Optionally returns the value associated with a key.
*
- * @key the key value
+ * @param key the key value
* @return an option value containing the value associated with `key` in this map,
* or `None` if none exists.
*/
@@ -109,7 +110,7 @@ self =>
* @param default a computation that yields a default value in case no binding for `key` is
* found in the map.
* @tparam B1 the result type of the default computation.
- * @return the value assocuated with `key` if it exists,
+ * @return the value associated with `key` if it exists,
* otherwise the result of the `default` computation.
* @usecase def getOrElse(key: A, default: => B): B
*/
@@ -181,15 +182,16 @@ self =>
*
* @return an iterator over all keys.
*/
- @deprecated("use `keysIterator' instead")
- def keys: Iterator[A] = keysIterator
+ @migration(2, 8, "As of 2.8, keys returns Iterable[A] rather than Iterator[A].")
+ def keys: Iterable[A] = keySet
/** Collects all values of this map in an iterable collection.
* @return the values of this map as an iterable.
*/
- def valuesIterable: Iterable[B] = new DefaultValuesIterable
+ @migration(2, 8, "As of 2.8, values returns Iterable[B] rather than Iterator[B].")
+ def values: Iterable[B] = new DefaultValuesIterable
- /** The implementation class of the iterable returned by `valuesIterable`.
+ /** The implementation class of the iterable returned by `values`.
*/
protected class DefaultValuesIterable extends Iterable[B] {
def iterator = valuesIterator
@@ -207,13 +209,6 @@ self =>
def next = iter.next._2
}
- /** Creates an iterator for all contained values.
- *
- * @return an iterator over all values.
- */
- @deprecated("use `valuesIterator' instead")
- def values: Iterator[B] = valuesIterator
-
/** Defines the default value computation for the map,
* returned when a key is not found
* The method implemented here throws an exception,
@@ -230,7 +225,7 @@ self =>
* @return an immutable map consisting only of those key value pairs of this map where the key satisfies
* the predicate `p`. The resulting map wraps the original map without copying any elements.
*/
- def filterKeys(p: A => Boolean) = new DefaultMap[A, B] {
+ def filterKeys(p: A => Boolean): Map[A, B] = new DefaultMap[A, B] {
override def foreach[C](f: ((A, B)) => C): Unit = for (kv <- self) if (p(kv._1)) f(kv)
def iterator = self.iterator.filter(kv => p(kv._1))
override def contains(key: A) = self.contains(key) && p(key)
@@ -245,7 +240,7 @@ self =>
/** A map view resulting from applying a given function `f` to each value
* associated with a key in this map.
*/
- def mapValues[C](f: B => C) = new DefaultMap[A, C] {
+ def mapValues[C](f: B => C): Map[A, C] = new DefaultMap[A, C] {
override def foreach[D](g: ((A, C)) => D): Unit = for ((k, v) <- self) g((k, f(v)))
def iterator = for ((k, v) <- self.iterator) yield (k, f(v))
override def size = self.size
@@ -291,18 +286,25 @@ self =>
* @return a new map with the given bindings added to this map
* @usecase def + (kvs: Traversable[(A, B)]): Map[A, B]
*/
- def ++[B1 >: B](kvs: Traversable[(A, B1)]): Map[A, B1] =
- ((repr: Map[A, B1]) /: kvs) (_ + _)
+ def ++[B1 >: B](xs: TraversableOnce[(A, B1)]): Map[A, B1] =
+ ((repr: Map[A, B1]) /: xs) (_ + _)
- /** Adds all key/value pairs produced by an iterator to this map, returning a new map.
+ /** Returns a new map with all key/value pairs for which the predicate
+ * <code>p</code> returns <code>false</code>.
*
- * @param iter the iterator producing key/value pairs
- * @tparam B1 the type of the added values
- * @return a new map with the given bindings added to this map
- * @usecase def + (iter: Iterator[(A, B)]): Map[A, B]
+ * @param p A predicate over key-value pairs
+ * @note This method works by successively removing elements for which the
+ * predicate is true from this map.
+ * If removal is slow, or you expect that most elements of the map
+ * will be removed, you might consider using <code>filter</code>
+ * with a negated predicate instead.
*/
- def ++[B1 >: B] (iter: Iterator[(A, B1)]): Map[A, B1] =
- ((repr: Map[A, B1]) /: iter) (_ + _)
+ override def filterNot(p: ((A, B)) => Boolean): This = {
+ var res: This = repr
+ for (kv <- this)
+ if (p(kv)) res = (res - kv._1).asInstanceOf[This] // !!! concrete overrides abstract problem
+ res
+ }
/** Appends all bindings of this map to a string builder using start, end, and separator strings.
* The written text begins with the string `start` and ends with the string
@@ -320,7 +322,7 @@ self =>
/** Defines the prefix of this object's `toString` representation.
* @return a string representation which starts the result of `toString` applied to this $coll.
- * Unless overridden in subclasse, the string prefix of every map is `"Map"`.
+ * Unless overridden in subclasses, the string prefix of every map is `"Map"`.
*/
override def stringPrefix: String = "Map"
@@ -351,7 +353,7 @@ self =>
}
} catch {
case ex: ClassCastException =>
- println("calss cast "); false
+ println("class cast "); false
}}
case _ =>
false
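
The changes to `filterKeys`, `mapValues`, and `filterNot` above can be illustrated with a hypothetical snippet (not from the patch):

```scala
object MapViewsDemo extends App {
  val m = Map("a" -> 1, "b" -> 2, "c" -> 3)
  // filterKeys and mapValues wrap the original map lazily (no copying).
  val noB = m.filterKeys(_ != "b")
  assert(noB.get("b") == None && noB("a") == 1)
  val tens = m.mapValues(_ * 10)
  assert(tens("c") == 30)
  // filterNot builds a new map, dropping the pairs matching the predicate.
  assert(m.filterNot(_._2 > 1) == Map("a" -> 1))
}
```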
diff --git a/src/library/scala/collection/MapProxyLike.scala b/src/library/scala/collection/MapProxyLike.scala
index 427eaa6e2c..f269a368dd 100644
--- a/src/library/scala/collection/MapProxyLike.scala
+++ b/src/library/scala/collection/MapProxyLike.scala
@@ -36,10 +36,9 @@ trait MapProxyLike[A, +B, +This <: MapLike[A, B, This] with Map[A, B]]
override def isDefinedAt(key: A) = self.isDefinedAt(key)
override def keySet: Set[A] = self.keySet
override def keysIterator: Iterator[A] = self.keysIterator
- override def keys: Iterator[A] = self.keysIterator
- override def valuesIterable: Iterable[B] = self.valuesIterable
+ override def keys: Iterable[A] = self.keys
+ override def values: Iterable[B] = self.values
override def valuesIterator: Iterator[B] = self.valuesIterator
- override def values: Iterator[B] = self.valuesIterator
override def default(key: A): B = self.default(key)
override def filterKeys(p: A => Boolean) = self.filterKeys(p)
override def mapValues[C](f: B => C) = self.mapValues(f)
diff --git a/src/library/scala/collection/SeqLike.scala b/src/library/scala/collection/SeqLike.scala
index 32aae28851..0db64926a6 100644
--- a/src/library/scala/collection/SeqLike.scala
+++ b/src/library/scala/collection/SeqLike.scala
@@ -11,7 +11,7 @@
package scala.collection
-import mutable.{ListBuffer, HashMap, GenericArray}
+import mutable.{ListBuffer, HashMap, ArraySeq}
import immutable.{List, Range}
import generic._
@@ -169,7 +169,7 @@ trait SeqLike[+A, +Repr] extends IterableLike[A, Repr] { self =>
*
* @param idx The index to select.
* @return the element of this $coll at index `idx`, where `0` indicates the first element.
- * @throws `IndexOutOfBoundsEsxception` if `idx` does not satisfy `0 <= idx < length`.
+ * @throws `IndexOutOfBoundsException` if `idx` does not satisfy `0 <= idx < length`.
*/
def apply(idx: Int): A
@@ -212,7 +212,7 @@ trait SeqLike[+A, +Repr] extends IterableLike[A, Repr] { self =>
*/
def isDefinedAt(idx: Int): Boolean = (idx >= 0) && (idx < length)
- /** Computes length of longest segment whose elements all satisfy some preficate.
+ /** Computes length of longest segment whose elements all satisfy some predicate.
*
* $mayNotTerminateInf
*
@@ -229,7 +229,7 @@ trait SeqLike[+A, +Repr] extends IterableLike[A, Repr] { self =>
i
}
- /** Returns the length of the longest prefix whose elements all satisfy some preficate.
+ /** Returns the length of the longest prefix whose elements all satisfy some predicate.
*
* $mayNotTerminateInf
*
@@ -261,12 +261,15 @@ trait SeqLike[+A, +Repr] extends IterableLike[A, Repr] { self =>
def indexWhere(p: A => Boolean, from: Int): Int = {
var i = from
var it = iterator.drop(from)
- while (it.hasNext && !p(it.next()))
- i += 1
- if (it.hasNext) i else -1
+ while (it.hasNext) {
+ if (p(it.next())) return i
+ else i += 1
+ }
+
+ -1
}
- /** Returns index of the first element satisying a predicate, or `-1`.
+ /** Returns index of the first element satisfying a predicate, or `-1`.
*/
def findIndexOf(p: A => Boolean): Int = indexWhere(p)
@@ -491,7 +494,7 @@ trait SeqLike[+A, +Repr] extends IterableLike[A, Repr] { self =>
/** Finds last index before or at a given end index where this $coll contains a given sequence as a slice.
* @param that the sequence to test
- * @param end the end idnex
+ * @param end the end index
* @return the last index `<= end` such that the elements of this $coll starting at this index
* match the elements of sequence `that`, or `-1` if no such subsequence exists.
*/
@@ -557,7 +560,7 @@ trait SeqLike[+A, +Repr] extends IterableLike[A, Repr] { self =>
* ''n'' times in `that`, then the first ''n'' occurrences of `x` will not form
* part of the result, but any following occurrences will.
*/
- def diff[B >: A, That](that: Seq[B]): Repr = {
+ def diff[B >: A](that: Seq[B]): Repr = {
val occ = occCounts(that)
val b = newBuilder
for (x <- this)
@@ -585,7 +588,7 @@ trait SeqLike[+A, +Repr] extends IterableLike[A, Repr] { self =>
* ''n'' times in `that`, then the first ''n'' occurrences of `x` will be retained
* in the result, but any following occurrences will be omitted.
*/
- def intersect[B >: A, That](that: Seq[B]): Repr = {
+ def intersect[B >: A](that: Seq[B]): Repr = {
val occ = occCounts(that)
val b = newBuilder
for (x <- this)
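The two hunks above drop the unused `That` type parameter from `diff` and `intersect`; both simply return `Repr`. Their multiset semantics (occurrence counts matter, receiver order is preserved) can be sketched with the same methods in current Scala:

```scala
val xs = Seq(1, 1, 2, 3, 3, 3)
val ys = Seq(1, 3, 3)
// multiset difference: one 1 and two 3s are removed
assert(xs.diff(ys) == Seq(1, 2, 3))
// multiset intersection: occurrence counts are capped by the argument
assert(xs.intersect(ys) == Seq(1, 3, 3))
```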
@@ -607,7 +610,7 @@ trait SeqLike[+A, +Repr] extends IterableLike[A, Repr] { self =>
*
* @return A new $coll which contains the first occurrence of every element of this $coll.
*/
- def removeDuplicates: Repr = {
+ def distinct: Repr = {
val b = newBuilder
var seen = Set[A]() //TR: should use mutable.HashSet?
for (x <- this) {
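`removeDuplicates` is renamed to `distinct` above (the `//TR` comment notes the immutable `Set` could become a `mutable.HashSet`). Its first-occurrence-wins, order-preserving behavior is unchanged in current Scala:

```scala
val xs = Seq(2, 1, 2, 3, 1)
// keeps only the first occurrence of each element, in original order
assert(xs.distinct == Seq(2, 1, 3))
```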
@@ -627,7 +630,7 @@ trait SeqLike[+A, +Repr] extends IterableLike[A, Repr] { self =>
* @tparam B the element type of the returned $coll.
* @tparam That $thatinfo
* @param bf $bfinfo
- * @return a new collection of type `That` consisting of all elements of this $coll
+ * @return a new $coll consisting of all elements of this $coll
* except that `replaced` elements starting from `from` are replaced
* by `patch`.
* @usecase def patch(from: Int, that: Seq[A], replaced: Int): $Coll[A]
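The doc fix above concerns `patch`, which splices a replacement sequence into a copy of the collection. A sketch of its semantics (unchanged in current Scala):

```scala
val xs = Seq(1, 2, 3, 4, 5)
// replace 2 elements starting at index 1 with the patch sequence
assert(xs.patch(1, Seq(9, 9, 9), 2) == Seq(1, 9, 9, 9, 4, 5))
// replaced = 0 makes patch a pure insertion
assert(xs.patch(2, Seq(0), 0) == Seq(1, 2, 0, 3, 4, 5))
```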
@@ -650,7 +653,7 @@ trait SeqLike[+A, +Repr] extends IterableLike[A, Repr] { self =>
* @tparam B the element type of the returned $coll.
* @tparam That $thatinfo
* @param bf $bfinfo
- * @return a new collection of type `That` which is a copy of this $coll with the element at position `index` replaced by `elem`.
+ * @return a new $coll which is a copy of this $coll with the element at position `index` replaced by `elem`.
* @usecase def updated(index: Int, elem: A): $Coll[A]
* @return a copy of this $coll with the element at position `index` replaced by `elem`.
*/
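`updated`, documented above, produces a copy with a single element replaced; the receiver itself is untouched:

```scala
val xs = Seq(1, 2, 3)
assert(xs.updated(1, 9) == Seq(1, 9, 3))
// the original immutable sequence is unchanged
assert(xs == Seq(1, 2, 3))
```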
@@ -721,7 +724,7 @@ trait SeqLike[+A, +Repr] extends IterableLike[A, Repr] { self =>
b ++= thisCollection
while (diff > 0) {
b += elem
- diff -=1
+ diff -= 1
}
b.result
}
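The loop above (whose spacing the hunk fixes) belongs to `padTo`: it appends `elem` until length `len` is reached, and when the collection is already long enough, `diff <= 0` and nothing is appended:

```scala
val xs = Seq(1, 2)
assert(xs.padTo(5, 0) == Seq(1, 2, 0, 0, 0))
// a target length <= the current length is a no-op
assert(xs.padTo(1, 0) == Seq(1, 2))
```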
@@ -757,12 +760,34 @@ trait SeqLike[+A, +Repr] extends IterableLike[A, Repr] { self =>
* the desired ordering.
* @return a $coll consisting of the elements of this $coll
* sorted according to the comparison function `lt`.
- * @ex {{{
+ * @example {{{
* List("Steve", "Tom", "John", "Bob").sortWith(_.compareTo(_) < 0) =
* List("Bob", "John", "Steve", "Tom")
* }}}
*/
- def sortWith(lt: (A, A) => Boolean): Repr = sortWith(Ordering fromLessThan lt)
+ def sortWith(lt: (A, A) => Boolean): Repr = sorted(Ordering fromLessThan lt)
+
+ /** Sorts this $Coll according to the Ordering which results from transforming
+ * an implicitly given Ordering with a transformation function.
+ * @see scala.math.Ordering
+ * $willNotTerminateInf
+ * @param f the transformation function mapping elements
+ * to some other domain `B`.
+ * @param ord the ordering assumed on domain `B`.
+ * @tparam B the target type of the transformation `f`, and the type where
+ * the ordering `ord` is defined.
+ * @return a $coll consisting of the elements of this $coll
+ * sorted according to the ordering where `x < y` if
+ * `ord.lt(f(x), f(y))`.
+ *
+ * @example {{{
+ * val words = "The quick brown fox jumped over the lazy dog".split(' ')
+ * // this works because scala.Ordering will implicitly provide an Ordering[Tuple2[Int, Char]]
+ * words.sortBy(x => (x.length, x.head))
+ * res0: Array[String] = Array(The, dog, fox, the, lazy, over, brown, quick, jumped)
+ * }}}
+ */
+ def sortBy[B](f: A => B)(implicit ord: Ordering[B]): Repr = sorted(ord on f)
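The hunk above settles the final naming of the sorting triad: `sorted` (implicit `Ordering`), `sortWith` (explicit comparison function), and `sortBy` (projection, via `ord on f`). All three behave the same way in current Scala:

```scala
assert(Seq(3, 1, 2).sorted == Seq(1, 2, 3))

val words = Seq("pear", "fig", "apple")
// explicit less-than comparison
assert(words.sortWith(_.length < _.length) == Seq("fig", "pear", "apple"))
// sort by a projection, here the (length, first char) pair
assert(words.sortBy(w => (w.length, w.head)) == Seq("fig", "pear", "apple"))
```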
/** Sorts this $coll according to an Ordering.
*
@@ -775,42 +800,19 @@ trait SeqLike[+A, +Repr] extends IterableLike[A, Repr] { self =>
* @return a $coll consisting of the elements of this $coll
* sorted according to the ordering `ord`.
*/
- def sortWith[B >: A](ord: Ordering[B]): Repr = {
- val arr = new GenericArray[A](this.length)
+ def sorted[B >: A](implicit ord: Ordering[B]): Repr = {
+ val arr = new ArraySeq[A](this.length)
var i = 0
for (x <- this) {
arr(i) = x
i += 1
}
- java.util.Arrays.sort(
- arr.array, ord.asInstanceOf[Ordering[Object]])
+ java.util.Arrays.sort(arr.array, ord.asInstanceOf[Ordering[Object]])
val b = newBuilder
for (x <- arr) b += x
b.result
}
- /** Sorts this $Coll according to the Ordering which results from transforming
- * an implicitly given Ordering with a transformation function.
- * @see scala.math.Ordering
- * $willNotTerminateInf
- * @param f the transformation function mapping elements
- * to some other domain `B`.
- * @param ord the ordering assumed on domain `B`.
- * @tparam B the target type of the transformation `f`, and the type where
- * the ordering `ord` is defined.
- * @return a $coll consisting of the elements of this $coll
- * sorted according to the ordering where `x < y` if
- * `ord.lt(f(x), f(y))`.
- *
- * @ex {{{
- * val words = "The quick brown fox jumped over the lazy dog".split(' ')
- * // this works because scala.Ordering will implicitly provide an Ordering[Tuple2[Int, Char]]
- * words.sortBy(x => (x.length, x.head))
- * res0: Array[String] = Array(The, dog, fox, the, lazy, over, brown, quick, jumped)
- * }}}
- */
- def sortBy[B](f: A => B)(implicit ord: Ordering[B]): Repr = sortWith(ord on f)
-
/** Converts this $coll to a sequence.
* $willNotTerminateInf
*
@@ -820,7 +822,7 @@ trait SeqLike[+A, +Repr] extends IterableLike[A, Repr] { self =>
/** Produces the range of all indices of this sequence.
*
- * @range a `Range` value from `0` to one less than the length of this $coll.
+ * @return a `Range` value from `0` to one less than the length of this $coll.
*/
def indices: Range = 0 until length
@@ -839,16 +841,17 @@ trait SeqLike[+A, +Repr] extends IterableLike[A, Repr] { self =>
override def hashCode() = (Seq.hashSeed /: this)(_ * 41 + _.hashCode)
override def equals(that: Any): Boolean = that match {
- case that: Seq[_] => (that canEqual this) && (this sameElements that)
- case _ => false
+ case that: Seq[_] => (that canEqual this) && (this sameElements that)
+ case _ => false
}
/* Need to override string, so that it's not the Function1's string that gets mixed in.
*/
override def toString = super[IterableLike].toString
- /** Returns index of the last element satisying a predicate, or -1.
+ /** Returns index of the last element satisfying a predicate, or -1.
*/
+ @deprecated("use `lastIndexWhere` instead")
def findLastIndexOf(p: A => Boolean): Int = lastIndexWhere(p)
/** Tests whether every element of this $coll relates to the
@@ -862,15 +865,7 @@ trait SeqLike[+A, +Repr] extends IterableLike[A, Repr] { self =>
* and `y` of `that`, otherwise `false`.
*/
@deprecated("use `corresponds` instead")
- def equalsWith[B](that: Seq[B])(f: (A,B) => Boolean): Boolean = {
- val i = this.iterator
- val j = that.iterator
- while (i.hasNext && j.hasNext)
- if (!f(i.next, j.next))
- return false
-
- !i.hasNext && !j.hasNext
- }
+ def equalsWith[B](that: Seq[B])(f: (A,B) => Boolean): Boolean = corresponds(that)(f)
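The deprecated `equalsWith` now delegates to `corresponds`, which is true exactly when both sequences have the same length and the predicate holds for every aligned pair:

```scala
val xs = Seq(1, 2, 3)
assert(xs.corresponds(Seq("a", "bb", "ccc"))(_ == _.length))
// a length mismatch alone makes it false
assert(!xs.corresponds(Seq(1, 2))(_ == _))
```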
/**
* returns a projection that can be used to call non-strict <code>filter</code>,
diff --git a/src/library/scala/collection/SeqProxyLike.scala b/src/library/scala/collection/SeqProxyLike.scala
index 3dfac63dde..24ee0b430a 100644
--- a/src/library/scala/collection/SeqProxyLike.scala
+++ b/src/library/scala/collection/SeqProxyLike.scala
@@ -23,11 +23,10 @@ import generic._
* @version 2.8
* @since 2.8
*/
-trait SeqProxyLike[+A, +This <: SeqLike[A, This] with Seq[A]] extends SeqLike[A, This] with IterableProxyLike[A, This] {
+trait SeqProxyLike[+A, +Repr <: SeqLike[A, Repr] with Seq[A]] extends SeqLike[A, Repr] with IterableProxyLike[A, Repr] {
override def length: Int = self.length
override def apply(idx: Int): A = self.apply(idx)
override def lengthCompare(len: Int): Int = self.lengthCompare(len)
- override def size = self.size
override def isDefinedAt(x: Int): Boolean = self.isDefinedAt(x)
override def segmentLength(p: A => Boolean, from: Int): Int = self.segmentLength(p, from)
override def prefixLength(p: A => Boolean) = self.prefixLength(p)
@@ -40,22 +39,34 @@ trait SeqProxyLike[+A, +This <: SeqLike[A, This] with Seq[A]] extends SeqLike[A,
override def lastIndexOf[B >: A](elem: B, end: Int): Int = self.lastIndexWhere(elem ==, end)
override def lastIndexWhere(p: A => Boolean): Int = self.lastIndexWhere(p, length - 1)
override def lastIndexWhere(p: A => Boolean, end: Int): Int = self.lastIndexWhere(p, end)
- override def reverse: This = self.reverse
+ override def reverse: Repr = self.reverse
+ override def reverseMap[B, That](f: A => B)(implicit bf: CanBuildFrom[Repr, B, That]): That = self.reverseMap(f)(bf)
override def reverseIterator: Iterator[A] = self.reverseIterator
override def startsWith[B](that: Seq[B], offset: Int): Boolean = self.startsWith(that, offset)
override def startsWith[B](that: Seq[B]): Boolean = self.startsWith(that)
override def endsWith[B](that: Seq[B]): Boolean = self.endsWith(that)
override def indexOfSlice[B >: A](that: Seq[B]): Int = self.indexOfSlice(that)
+ override def indexOfSlice[B >: A](that: Seq[B], from: Int): Int = self.indexOfSlice(that, from)
+ override def lastIndexOfSlice[B >: A](that: Seq[B]): Int = self.lastIndexOfSlice(that)
+ override def lastIndexOfSlice[B >: A](that: Seq[B], end: Int): Int = self.lastIndexOfSlice(that, end)
+ override def containsSlice[B](that: Seq[B]): Boolean = self.indexOfSlice(that) != -1
override def contains(elem: Any): Boolean = self.contains(elem)
- override def union[B >: A, That](that: Seq[B])(implicit bf: CanBuildFrom[This, B, That]): That = self.union(that)(bf)
- override def diff[B >: A, That](that: Seq[B]): This = self.diff(that)
- override def intersect[B >: A, That](that: Seq[B]): This = self.intersect(that)
- override def removeDuplicates: This = self.removeDuplicates
- override def patch[B >: A, That](from: Int, patch: Seq[B], replaced: Int)(implicit bf: CanBuildFrom[This, B, That]): That = self.patch(from, patch, replaced)(bf)
- override def padTo[B >: A, That](len: Int, elem: B)(implicit bf: CanBuildFrom[This, B, That]): That = self.padTo(len, elem)(bf)
+ override def union[B >: A, That](that: Seq[B])(implicit bf: CanBuildFrom[Repr, B, That]): That = self.union(that)(bf)
+ override def diff[B >: A](that: Seq[B]): Repr = self.diff(that)
+ override def intersect[B >: A](that: Seq[B]): Repr = self.intersect(that)
+ override def distinct: Repr = self.distinct
+ override def patch[B >: A, That](from: Int, patch: Seq[B], replaced: Int)(implicit bf: CanBuildFrom[Repr, B, That]): That = self.patch(from, patch, replaced)(bf)
+ override def updated[B >: A, That](index: Int, elem: B)(implicit bf: CanBuildFrom[Repr, B, That]): That = self.updated(index, elem)(bf)
+ override def +:[B >: A, That](elem: B)(implicit bf: CanBuildFrom[Repr, B, That]): That = self.+:(elem)(bf)
+ override def :+[B >: A, That](elem: B)(implicit bf: CanBuildFrom[Repr, B, That]): That = self.:+(elem)(bf)
+ override def padTo[B >: A, That](len: Int, elem: B)(implicit bf: CanBuildFrom[Repr, B, That]): That = self.padTo(len, elem)(bf)
+ override def corresponds[B](that: Seq[B])(p: (A,B) => Boolean): Boolean = self.corresponds(that)(p)
+ override def sortWith(lt: (A, A) => Boolean): Repr = self.sortWith(lt)
+ override def sortBy[B](f: A => B)(implicit ord: Ordering[B]): Repr = self.sortBy(f)(ord)
+ override def sorted[B >: A](implicit ord: Ordering[B]): Repr = self.sorted(ord)
override def indices: Range = self.indices
override def view = self.view
override def view(from: Int, until: Int) = self.view(from, until)
- override def equalsWith[B](that: Seq[B])(f: (A,B) => Boolean): Boolean = (self zip that) forall { case (x,y) => f(x,y) }
- override def containsSlice[B](that: Seq[B]): Boolean = self.indexOfSlice(that) != -1
}
+
+
diff --git a/src/library/scala/collection/SeqView.scala b/src/library/scala/collection/SeqView.scala
index 61443b3b90..79d50f1de6 100644
--- a/src/library/scala/collection/SeqView.scala
+++ b/src/library/scala/collection/SeqView.scala
@@ -21,6 +21,8 @@ import TraversableView.NoBuilder
*/
trait SeqView[+A, +Coll] extends SeqViewLike[A, Coll, SeqView[A, Coll]]
+/** $factoryInfo
+ */
object SeqView {
type Coll = TraversableView[_, C] forSome {type C <: Traversable[_]}
implicit def canBuildFrom[A]: CanBuildFrom[Coll, A, SeqView[A, Seq[_]]] =
diff --git a/src/library/scala/collection/SeqViewLike.scala b/src/library/scala/collection/SeqViewLike.scala
index 1a8cd20013..7014833a46 100644
--- a/src/library/scala/collection/SeqViewLike.scala
+++ b/src/library/scala/collection/SeqViewLike.scala
@@ -21,8 +21,8 @@ import TraversableView.NoBuilder
* @version 2.8
*/
trait SeqViewLike[+A,
- +Coll,
- +This <: SeqView[A, Coll] with SeqViewLike[A, Coll, This]]
+ +Coll,
+ +This <: SeqView[A, Coll] with SeqViewLike[A, Coll, This]]
extends Seq[A] with SeqLike[A, This] with IterableView[A, Coll] with IterableViewLike[A, Coll, This]
{ self =>
@@ -31,6 +31,11 @@ trait SeqViewLike[+A,
override def apply(idx: Int): B
}
+ trait Forced[B] extends Transformed[B] with super.Forced[B] {
+ override def length = forced.length
+ override def apply(idx: Int) = forced.apply(idx)
+ }
+
trait Sliced extends Transformed[A] with super.Sliced {
override def length = ((until min self.length) - from) max 0
override def apply(idx: Int): A =
@@ -104,7 +109,8 @@ trait SeqViewLike[+A,
trait Zipped[B] extends Transformed[(A, B)] with super.Zipped[B] {
protected[this] lazy val thatSeq = other.toSeq
- override def length: Int = self.length min thatSeq.length
+ /* Have to be careful here - other may be an infinite sequence. */
+ override def length = if ((thatSeq lengthCompare self.length) <= 0) thatSeq.length else self.length
override def apply(idx: Int) = (self.apply(idx), thatSeq.apply(idx))
}
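The `lengthCompare` guard above lets a zipped view compute its length even when `other` is infinite, because a zip is only as long as its shorter operand. The user-visible effect, shown with finite operands on the current view API:

```scala
val zipped = Seq(1, 2, 3).view.zip(Seq('a', 'b'))
// length is the minimum of the two operand lengths
assert(zipped.size == 2)
assert(zipped.toList == List((1, 'a'), (2, 'b')))
```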
@@ -143,9 +149,20 @@ trait SeqViewLike[+A,
override def stringPrefix = self.stringPrefix+"P"
}
+ trait Prepended[B >: A] extends Transformed[B] {
+ protected[this] val fst: B
+ override def iterator: Iterator[B] = Iterator.single(fst) ++ self.iterator
+ override def length: Int = 1 + self.length
+ override def apply(idx: Int): B =
+ if (idx == 0) fst
+ else self.apply(idx - 1)
+ override def stringPrefix = self.stringPrefix+"A"
+ }
+
/** Boilerplate method, to override in each subclass
* This method could be eliminated if Scala had virtual classes
*/
+ protected override def newForced[B](xs: => Seq[B]): Transformed[B] = new Forced[B] { val forced = xs }
protected override def newAppended[B >: A](that: Traversable[B]): Transformed[B] = new Appended[B] { val rest = that }
protected override def newMapped[B](f: A => B): Transformed[B] = new Mapped[B] { val mapping = f }
protected override def newFlatMapped[B](f: A => Traversable[B]): Transformed[B] = new FlatMapped[B] { val mapping = f }
@@ -157,6 +174,7 @@ trait SeqViewLike[+A,
protected override def newZippedAll[A1 >: A, B](that: Iterable[B], _thisElem: A1, _thatElem: B): Transformed[(A1, B)] = new ZippedAll[A1, B] { val other = that; val thisElem = _thisElem; val thatElem = _thatElem }
protected def newReversed: Transformed[A] = new Reversed { }
protected def newPatched[B >: A](_from: Int, _patch: Seq[B], _replaced: Int): Transformed[B] = new Patched[B] { val from = _from; val patch = _patch; val replaced = _replaced }
+ protected def newPrepended[B >: A](elem: B): Transformed[B] = new Prepended[B] { protected[this] val fst = elem }
override def reverse: This = newReversed.asInstanceOf[This]
@@ -167,14 +185,35 @@ trait SeqViewLike[+A,
// else super.patch[B, That](from, patch, replaced)(bf)
}
- //TR TODO: updated, +: ed :+ ed
-
override def padTo[B >: A, That](len: Int, elem: B)(implicit bf: CanBuildFrom[This, B, That]): That =
patch(length, fill(len - length)(elem), 0)
override def reverseMap[B, That](f: A => B)(implicit bf: CanBuildFrom[This, B, That]): That =
reverse.map(f)
+ override def updated[B >: A, That](index: Int, elem: B)(implicit bf: CanBuildFrom[This, B, That]): That = {
+ require(0 <= index && index < length)
+ patch(index, List(elem), 1)(bf)
+ }
+
+ override def +:[B >: A, That](elem: B)(implicit bf: CanBuildFrom[This, B, That]): That =
+ newPrepended(elem).asInstanceOf[That]
+
+ override def :+[B >: A, That](elem: B)(implicit bf: CanBuildFrom[This, B, That]): That =
+ ++(Iterator.single(elem))(bf)
+
+ override def union[B >: A, That](that: Seq[B])(implicit bf: CanBuildFrom[This, B, That]): That =
+ newForced(thisSeq union that).asInstanceOf[That]
+
+ override def diff[B >: A](that: Seq[B]): This =
+ newForced(thisSeq diff that).asInstanceOf[This]
+
+ override def intersect[B >: A](that: Seq[B]): This =
+ newForced(thisSeq intersect that).asInstanceOf[This]
+
+ override def sorted[B >: A](implicit ord: Ordering[B]): This =
+ newForced(thisSeq sorted ord).asInstanceOf[This]
+
override def stringPrefix = "SeqView"
}
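The overrides above give views a lazy `+:`/`:+` (via `Prepended` and `++`), while `union`, `diff`, `intersect`, and `sorted` go through `newForced`, i.e. they must evaluate the underlying sequence. A sketch of the user-visible behavior on the current view API, which kept these operations:

```scala
val v = Seq(2, 3, 1).view
assert((0 +: v).toList == List(0, 2, 3, 1))
assert((v :+ 9).toList == List(2, 3, 1, 9))
// sorting cannot stay lazy: the elements must be forced first
assert(v.sorted.toList == List(1, 2, 3))
```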
diff --git a/src/library/scala/collection/SetLike.scala b/src/library/scala/collection/SetLike.scala
index 156d0d8b2b..48b5358afc 100644
--- a/src/library/scala/collection/SetLike.scala
+++ b/src/library/scala/collection/SetLike.scala
@@ -187,7 +187,7 @@ self =>
* @note This operation contains an unchecked cast: if `that`
* is a set, it will assume with an unchecked cast
* that it has the same element type as this set.
- * Any subsequuent ClassCastException is treated as a `false` result.
+ * Any subsequent ClassCastException is treated as a `false` result.
*/
override def equals(that: Any): Boolean = that match {
case that: Set[_] =>
diff --git a/src/library/scala/collection/SortedMap.scala b/src/library/scala/collection/SortedMap.scala
index 24f363243f..7b0d35220e 100644
--- a/src/library/scala/collection/SortedMap.scala
+++ b/src/library/scala/collection/SortedMap.scala
@@ -21,9 +21,9 @@ import mutable.Builder
*/
trait SortedMap[A, +B] extends Map[A, B] with SortedMapLike[A, B, SortedMap[A, B]] {
/** Needs to be overridden in subclasses. */
- override def empty = SortedMap.empty[A, B]
+ override def empty: SortedMap[A, B] = SortedMap.empty[A, B]
- override protected[this] def newBuilder : Builder[(A, B), SortedMap[A, B]] =
+ override protected[this] def newBuilder: Builder[(A, B), SortedMap[A, B]] =
immutable.SortedMap.newBuilder[A, B]
}
diff --git a/src/library/scala/collection/Traversable.scala b/src/library/scala/collection/Traversable.scala
index c3d7fa8bc7..1cecec5227 100644
--- a/src/library/scala/collection/Traversable.scala
+++ b/src/library/scala/collection/Traversable.scala
@@ -29,8 +29,7 @@ trait Traversable[+A] extends TraversableLike[A, Traversable[A]]
override def isEmpty: Boolean
override def size: Int
override def hasDefiniteSize
- override def ++[B >: A, That](that: Traversable[B])(implicit bf: CanBuildFrom[Traversable[A], B, That]): That
- override def ++[B >: A, That](that: Iterator[B])(implicit bf: CanBuildFrom[Traversable[A], B, That]): That
+ override def ++[B >: A, That](xs: TraversableOnce[B])(implicit bf: CanBuildFrom[Traversable[A], B, That]): That
override def map[B, That](f: A => B)(implicit bf: CanBuildFrom[Traversable[A], B, That]): That
override def flatMap[B, That](f: A => Traversable[B])(implicit bf: CanBuildFrom[Traversable[A], B, That]): That
override def filter(p: A => Boolean): Traversable[A]
diff --git a/src/library/scala/collection/TraversableLike.scala b/src/library/scala/collection/TraversableLike.scala
index fc666ddb92..7008d3b5fd 100644
--- a/src/library/scala/collection/TraversableLike.scala
+++ b/src/library/scala/collection/TraversableLike.scala
@@ -88,8 +88,8 @@ import immutable.{List, Stream, Nil, ::}
*
* Note: will not terminate for infinite-sized collections.
*/
-trait TraversableLike[+A, +Repr] extends HasNewBuilder[A, Repr] {
-self =>
+trait TraversableLike[+A, +Repr] extends HasNewBuilder[A, Repr] with TraversableOnce[A] {
+ self =>
import Traversable.breaks._
@@ -148,23 +148,10 @@ self =>
result
}
- /** Tests whether the $coll is not empty.
- *
- * @return `true` if the $coll contains at least one element, `false` otherwise.
+ /** Tests whether this $coll can be repeatedly traversed.
+ * @return `true`
*/
- def nonEmpty: Boolean = !isEmpty
-
- /** The size of this $coll.
- *
- * $willNotTerminateInf
- *
- * @return the number of elements in this $coll.
- */
- def size: Int = {
- var result = 0
- for (x <- this) result += 1
- result
- }
+ final def isTraversableAgain = true
/** Tests whether this $coll is known to have a finite size.
* All strict collections are known to have finite size. For a non-strict collection
@@ -186,36 +173,15 @@ self =>
* @return a new collection of type `That` which contains all elements of this $coll
* followed by all elements of `that`.
*
- * @usecase def ++(that: Traversable[A]): $Coll[A]
- *
- * @return a new $coll which contains all elements of this $coll
- * followed by all elements of `that`.
- */
- def ++[B >: A, That](that: Traversable[B])(implicit bf: CanBuildFrom[Repr, B, That]): That = {
- val b = bf(repr)
- b ++= thisCollection
- b ++= that
- b.result
- }
-
- /** Concatenates this $coll with the elements of an iterator.
- *
- * @param that the iterator to append.
- * @tparam B the element type of the returned collection.
- * @tparam That $thatinfo
- * @param bf $bfinfo
- * @return a new collection of type `That` which contains all elements of this $coll
- * followed by all elements of `that`.
- *
- * @usecase def ++(that: Iterator[A]): $Coll[A]
+ * @usecase def ++(that: TraversableOnce[A]): $Coll[A]
*
* @return a new $coll which contains all elements of this $coll
* followed by all elements of `that`.
*/
- def ++[B >: A, That](that: Iterator[B])(implicit bf: CanBuildFrom[Repr, B, That]): That = {
+ def ++[B >: A, That](xs: TraversableOnce[B])(implicit bf: CanBuildFrom[Repr, B, That]): That = {
val b = bf(repr)
b ++= thisCollection
- b ++= that
+ b ++= xs
b.result
}
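The two `++` overloads removed above (one for `Traversable`, one for `Iterator`) collapse into a single method over `TraversableOnce`. The payoff, shown with its modern descendant (`IterableOnce`), is that both collections and iterators are accepted by the same method:

```scala
val xs = Seq(1, 2)
assert((xs ++ Seq(3, 4)) == Seq(1, 2, 3, 4))
// an iterator works through the same overload
assert((xs ++ Iterator(5)) == Seq(1, 2, 5))
```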
@@ -292,13 +258,13 @@ self =>
* `pf` to each element on which it is defined and collecting the results.
* The order of the elements is preserved.
*
- * @usecase def partialMap[B](pf: PartialFunction[A, B]): $Coll[B]
+ * @usecase def collect[B](pf: PartialFunction[A, B]): $Coll[B]
*
* @return a new $coll resulting from applying the given partial function
* `pf` to each element on which it is defined and collecting the results.
* The order of the elements is preserved.
*/
- def partialMap[B, That](pf: PartialFunction[A, B])(implicit bf: CanBuildFrom[Repr, B, That]): That = {
+ def collect[B, That](pf: PartialFunction[A, B])(implicit bf: CanBuildFrom[Repr, B, That]): That = {
val b = bf(repr)
for (x <- this) if (pf.isDefinedAt(x)) b += pf(x)
b.result
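`partialMap` is renamed to `collect` above: filter by the partial function's domain, then map through it. The name stuck in all later Scala versions:

```scala
val xs = Seq[Any](1, "two", 3)
// only elements where the partial function is defined survive
val ints = xs.collect { case i: Int => i * 10 }
assert(ints == Seq(10, 30))
```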
@@ -409,21 +375,6 @@ self =>
result
}
- /** Counts the number of elements in the $coll which satisfy a predicate.
- *
- * @param p the predicate used to test elements.
- * @return the number of elements satisfying the predicate `p`.
- *
- *
- */
- def count(p: A => Boolean): Int = {
- var cnt = 0
- for (x <- this) {
- if (p(x)) cnt += 1
- }
- cnt
- }
-
/** Finds the first element of the $coll satisfying a predicate, if any.
*
* $mayNotTerminateInf
@@ -464,227 +415,44 @@ self =>
}
*/
- /** Applies a binary operator to a start value and all elements of this $coll, going left to right.
- *
- * $willNotTerminateInf
- * $orderDependentFold
- *
- * @param z the start value.
- * @param op the binary operator.
- * @tparam B the result type of the binary operator.
- * @return the result of inserting `op` between consecutive elements of this $coll$,
- * going left to right with the start value `z` on the left:
- * {{{
- * op(...op(z, x,,1,,), x,,2,,, ..., x,,n,,)
- * }}}
- * where `x,,1,,, ..., x,,n,,` are the elements of this $coll.
- */
- def foldLeft[B](z: B)(op: (B, A) => B): B = {
- var result = z
- for (x <- this)
- result = op(result, x)
- result
- }
-
- /** Applies a binary operator to a start value and all elements of this $coll, going left to right.
- *
- * Note: `/:` is alternate syntax for `foldLeft`; `z /: xs` is the same as `xs foldLeft z`.
- * $willNotTerminateInf
- * $orderDependentFold
- *
- * @param z the start value.
- * @param op the binary operator.
- * @tparam B the result type of the binary operator.
- * @return the result of inserting `op` between consecutive elements of this $coll$,
- * going left to right with the start value `z` on the left:
- * {{{
- * op(...op(op(z, x,,1,,), x,,2,,), ..., x,,n,,)
- * }}}
- * where `x,,1,,, ..., x,,n,,` are the elements of this $coll.
- */
- def /: [B](z: B)(op: (B, A) => B): B = foldLeft(z)(op)
-
- /** Applies a binary operator to all elements of this $coll and a start value, going right to left.
- *
- * $willNotTerminateInf
- * $orderDependentFold
- * @param z the start value.
- * @param op the binary operator.
- * @tparam B the result type of the binary operator.
- * @return the result of inserting `op` between consecutive elements of this $coll$,
- * going right to left with the start value `z` on the right:
- * {{{
- * op(x,,1,,, op(x,,2,,, ... op(x,,n,,, z)...))
- * }}}
- * where `x,,1,,, ..., x,,n,,` are the elements of this $coll.
- */
- def foldRight[B](z: B)(op: (A, B) => B): B = {
- var elems: List[A] = Nil
- for (x <- this) elems = x :: elems
- elems.foldLeft(z)((x, y) => op(y, x))
- }
-
- /** Applies a binary operator to all elements of this $coll and a start value, going right to left.
- *
- * Note: `:\` is alternate syntax for `foldRight`; `xs :\ z` is the same as `xs foldRight z`.
- * $willNotTerminateInf
- * $orderDependentFold
- *
- * @param z the start value
- * @param op the binary operator
- * @tparam B the result type of the binary operator.
- * @return the result of inserting `op` between consecutive elements of this $coll$,
- * going right to left with the start value `z` on the right:
- * {{{
- * op(x,,1,,, op(x,,2,,, ... op(x,,n,,, z)...))
- * }}}
- * where `x,,1,,, ..., x,,n,,` are the elements of this $coll.
- */
- def :\ [B](z: B)(op: (A, B) => B): B = foldRight(z)(op)
-
- /** Applies a binary operator to all elements of this $coll, going left to right.
- * $willNotTerminateInf
- * $orderDependentFold
- *
- * @param op the binary operator.
- * @tparam B the result type of the binary operator.
- * @return the result of inserting `op` between consecutive elements of this $coll$,
- * going left to right:
- * {{{
- * op(...(op(x,,1,,, x,,2,,), ... ) , x,,n,,)
- * }}}
- * where `x,,1,,, ..., x,,n,,` are the elements of this $coll.
- * @throws `UnsupportedOperationException` if this $coll is empty.
- */
- def reduceLeft[B >: A](op: (B, A) => B): B = {
- if (isEmpty) throw new UnsupportedOperationException("empty.reduceLeft")
- var result: B = head
- var first = true
- for (x <- this)
- if (first) first = false
- else result = op(result, x)
- result
- }
-
- /** Optionally applies a binary operator to all elements of this $coll, going left to right.
- * $willNotTerminateInf
- * $orderDependentFold
- *
- * @param op the binary operator.
- * @tparam B the result type of the binary operator.
- * @return an option value containing the result of `reduceLeft(op)` is this $coll is nonempty,
- * `None` otherwise.
- */
- def reduceLeftOption[B >: A](op: (B, A) => B): Option[B] = {
- if (isEmpty) None else Some(reduceLeft(op))
- }
-
- /** Applies a binary operator to all elements of this $coll, going right to left.
- * $willNotTerminateInf
- * $orderDependentFold
- *
- * @param op the binary operator.
- * @tparam B the result type of the binary operator.
- * @return the result of inserting `op` between consecutive elements of this $coll$,
- * going right to left:
- * {{{
- * op(x,,1,,, op(x,,2,,, ..., op(x,,n-1,,, x,,n,,)...))
- * }}}
- * where `x,,1,,, ..., x,,n,,` are the elements of this $coll.
- * @throws `UnsupportedOperationException` if this $coll is empty.
- */
- def reduceRight[B >: A](op: (A, B) => B): B = {
- if (isEmpty) throw new UnsupportedOperationException("empty.reduceRight")
- var elems: List[A] = Nil
- for (x <- this) elems = x :: elems
- elems.reduceLeft[B]((x, y) => op(y, x))
- }
-
- /** Optionally applies a binary operator to all elements of this $coll, going right to left.
- * $willNotTerminateInf
- * $orderDependentFold
- *
- * @param op the binary operator.
- * @tparam B the result type of the binary operator.
- * @return an option value containing the result of `reduceRight(op)` is this $coll is nonempty,
- * `None` otherwise.
- */
- def reduceRightOption[B >: A](op: (A, B) => B): Option[B] =
- if (isEmpty) None else Some(reduceRight(op))
-
- /** Sums up the elements of this collection.
- *
- * @param num an implicit parameter defining a set of numeric operations
- * which includes the `+` operator to be used in forming the sum.
- * @tparam B the result type of the `+` operator.
- * @return the sum of all elements of this $coll with respect to the `+` operator in `num`.
- *
- * @usecase def sum: Int
- *
- * @return the sum of all elements in this $coll of numbers of type `Int`.
- * Instead of `Int`, any other type `T` with an implicit `Numeric[T]` implementation
- * can be used as element type of the $coll and as result type of `sum`.
- * Examples of such types are: `Long`, `Float`, `Double`, `BigInt`.
- *
- */
- def sum[B >: A](implicit num: Numeric[B]): B = {
- var acc = num.zero
- for (x <- self) acc = num.plus(acc, x)
- acc
- }
-
- /** Multiplies up the elements of this collection.
- *
- * @param num an implicit parameter defining a set of numeric operations
- * which includes the `*` operator to be used in forming the product.
- * @tparam B the result type of the `*` operator.
- * @return the product of all elements of this $coll with respect to the `*` operator in `num`.
- *
- * @usecase def product: Int
- *
- * @return the product of all elements in this $coll of numbers of type `Int`.
- * Instead of `Int`, any other type `T` with an implicit `Numeric[T]` implementation
- * can be used as element type of the $coll and as result type of `product`.
- * Examples of such types are: `Long`, `Float`, `Double`, `BigInt`.
- */
- def product[B >: A](implicit num: Numeric[B]): B = {
- var acc = num.one
- for (x <- self) acc = num.times(acc, x)
- acc
- }
-
- /** Finds the smallest element.
- *
- * @param cmp An ordering to be used for comparing elements.
- * @tparam B The type over which the ordering is defined.
- * @return the smallest element of this $coll with respect to the ordering `cmp`.
+ /**
+ * Produces a collection containing cumulative results of applying the operator going left to right.
+ * $willNotTerminateInf
+ * $orderDependent
*
- * @usecase def min: A
- * @return the smallest element of this $coll
+ * @tparam B the type of the elements in the resulting collection
+ * @tparam That the actual type of the resulting collection
+ * @param z the initial value
+ * @param op the binary operator applied to the intermediate result and the element
+ * @param bf $bfinfo
+ * @return collection with intermediate results
*/
- def min[B >: A](implicit cmp: Ordering[B]): A = {
- if (isEmpty) throw new UnsupportedOperationException("empty.min")
- var acc = self.head
- for (x <- self)
- if (cmp.lt(x, acc)) acc = x
- acc
+ def scanLeft[B, That](z: B)(op: (B, A) => B)(implicit bf: CanBuildFrom[Repr, B, That]): That = {
+ val b = bf(repr)
+ var acc = z
+ b += acc
+ for (x <- this) { acc = op(acc, x); b += acc }
+ b.result
}
- /** Finds the largest element.
- *
- * @param cmp An ordering to be used for comparing elements.
- * @tparam B The type over which the ordering is defined.
- * @return the largest element of this $coll with respect to the ordering `cmp`.
+ /**
+ * Produces a collection containing cumulative results of applying the operator going right to left.
+ * $willNotTerminateInf
+ * $orderDependent
*
- * @usecase def min: A
- * @return the largest element of this $coll.
+ * @tparam B the type of the elements in the resulting collection
+ * @tparam That the actual type of the resulting collection
+ * @param z the initial value
+ * @param op the binary operator applied to the intermediate result and the element
+ * @param bf $bfinfo
+ * @return collection with intermediate results
*/
- def max[B >: A](implicit cmp: Ordering[B]): A = {
- if (isEmpty) throw new UnsupportedOperationException("empty.max")
- var acc = self.head
- for (x <- self)
- if (cmp.gt(x, acc)) acc = x
- acc
+ def scanRight[B, That](z: B)(op: (A, B) => B)(implicit bf: CanBuildFrom[Repr, B, That]): That = {
+ val b = bf(repr)
+ var acc = z
+ b += acc
+ for (x <- reversed) { acc = op(x, acc); b += acc }
+ b.result
}
/** Selects the first element of this $coll.
@@ -849,7 +617,7 @@ self =>
b.result
}
- /** Spits this $coll into a prefix/suffix pair according to a predicate.
+ /** Splits this $coll into a prefix/suffix pair according to a predicate.
*
* Note: `c span p` is equivalent to (but possibly more efficient than)
* `(c takeWhile p, c dropWhile p)`, provided the evaluation of the predicate `p`
@@ -889,14 +657,6 @@ self =>
(l.result, r.result)
}
- /** Copies all elements of this $coll to a buffer.
- * $willNotTerminateInf
- * @param dest The buffer to which elements are copied.
- */
- def copyToBuffer[B >: A](dest: Buffer[B]) {
- for (x <- this) dest += x
- }
-
/** Copies elements of this $coll to an array.
* Fills the given array `xs` with at most `len` elements of
* this $coll, starting at position `start`.
@@ -925,188 +685,11 @@ self =>
}
}
- /** Copies elements of this $coll to an array.
- * Fills the given array `xs` with all elements of
- * this $coll, starting at position `start`.
- * Copying will stop once either the end of the current $coll is reached,
- * or the end of the array is reached.
- *
- * $willNotTerminateInf
- *
- * @param xs the array to fill.
- * @param start the starting index.
- * @tparam B the type of the elements of the array.
- *
- * @usecase def copyToArray(xs: Array[A], start: Int): Unit
- */
- def copyToArray[B >: A](xs: Array[B], start: Int) {
- copyToArray(xs, start, xs.length - start)
- }
-
- /** Copies elements of this $coll to an array.
- * Fills the given array `xs` with all elements of
- * this $coll, starting at position `0`.
- * Copying will stop once either the end of the current $coll is reached,
- * or the end of the array is reached.
- *
- * $willNotTerminateInf
- *
- * @param xs the array to fill.
- * @tparam B the type of the elements of the array.
- *
- * @usecase def copyToArray(xs: Array[A], start: Int): Unit
- */
- def copyToArray[B >: A](xs: Array[B]) {
- copyToArray(xs, 0)
- }
-
- /** Converts this $coll to an array.
- * $willNotTerminateInf
- *
- * @tparam B the type of the elements of the array. A `ClassManifest` for this type must
- * be available.
- * @return an array containing all elements of this $coll.
- *
- * @usecase def toArray: Array[A]
- * @return an array containing all elements of this $coll.
- * A `ClassManifest` must be available for the element type of this $coll.
- */
- def toArray[B >: A : ClassManifest]: Array[B] = {
- val result = new Array[B](size)
- copyToArray(result, 0)
- result
- }
-
- /** Converts this $coll to a list.
- * $willNotTerminateInf
- * @return a list containing all elements of this $coll.
- */
- def toList: List[A] = (new ListBuffer[A] ++= thisCollection).toList
-
- /** Converts this $coll to an iterable collection.
- * $willNotTerminateInf
- * @return an `Iterable` containing all elements of this $coll.
- */
- def toIterable: Iterable[A] = toStream
-
- /** Converts this $coll to a sequence.
- * $willNotTerminateInf
- * @return a sequence containing all elements of this $coll.
- */
- def toSeq: Seq[A] = toList
-
- /** Converts this $coll to an indexed sequence.
- * $willNotTerminateInf
- * @return an indexed sequence containing all elements of this $coll.
- */
- def toIndexedSeq[B >: A]: mutable.IndexedSeq[B] = (new ArrayBuffer[B] ++= thisCollection)
-
- /** Converts this $coll to a stream.
- * $willNotTerminateInf
- * @return a stream containing all elements of this $coll.
- */
- def toStream: Stream[A] = toList.toStream
-
- /** Converts this $coll to a set.
- * $willNotTerminateInf
- * @return a set containing all elements of this $coll.
- */
- def toSet[B >: A]: immutable.Set[B] = immutable.Set() ++ thisCollection
-
- /** Converts this $coll to a map. This method is unavailable unless
- * the elements are members of Tuple2, each ((K, V)) becoming a key-value
- * pair in the map. Duplicate keys will be overwritten by later keys:
- * if this is an unordered collection, which key is in the resulting map
- * is undefined.
- * $willNotTerminateInf
- * @return a map containing all elements of this $coll.
- */
- def toMap[T, U](implicit ev: A <:< (T, U)): immutable.Map[T, U] = {
- val b = immutable.Map.newBuilder[T, U]
- for (x <- this)
- b += x
-
- b.result
- }
-
- /** Displays all elements of this $coll in a string using start, end, and separator strings.
- *
- * @param start the starting string.
- * @param sep the separator string.
- * @param end the ending string.
- * @return a string representation of this $coll. The resulting string
- * begins with the string `start` and ends with the string
- * `end`. Inside, the string representations (w.r.t. the method `toString`)
- * of all elements of this $coll are separated by the string `sep`.
- *
- * @ex `List(1, 2, 3).mkString("(", "; ", ")") = "(1; 2; 3)"`
- */
- def mkString(start: String, sep: String, end: String): String =
- addString(new StringBuilder(), start, sep, end).toString
-
- /** Displays all elements of this $coll in a string using a separator string.
- *
- * @param sep the separator string.
- * @return a string representation of this $coll. In the resulting string
- * the string representations (w.r.t. the method `toString`)
- * of all elements of this $coll are separated by the string `sep`.
- *
- * @ex `List(1, 2, 3).mkString("|") = "1|2|3"`
- */
- def mkString(sep: String): String =
- addString(new StringBuilder(), sep).toString
-
- /** Displays all elements of this $coll in a string.
- * @return a string representation of this $coll. In the resulting string
- * the string representations (w.r.t. the method `toString`)
- * of all elements of this $coll follow each other without any separator string.
- */
- def mkString: String =
- addString(new StringBuilder()).toString
-
- /** Appends all elements of this $coll to a string builder using start, end, and separator strings.
- * The written text begins with the string `start` and ends with the string
- * `end`. Inside, the string representations (w.r.t. the method `toString`)
- * of all elements of this $coll are separated by the string `sep`.
- *
- * @param b the string builder to which elements are appended.
- * @param start the starting string.
- * @param sep the separator string.
- * @param end the ending string.
- * @return the string builder `b` to which elements were appended.
- */
- def addString(b: StringBuilder, start: String, sep: String, end: String): StringBuilder = {
- b append start
- var first = true
- for (x <- this) {
- if (first) first = false
- else b append sep
- b append x
- }
- b append end
- }
-
- /** Appends all elements of this $coll to a string builder using a separator string.
- * The written text consists of the string representations (w.r.t. the method `toString`)
- * of all elements of this $coll, separated by the string `sep`.
- *
- * @param b the string builder to which elements are appended.
- * @param sep the separator string.
- * @return the string builder `b` to which elements were appended.
- */
- def addString(b: StringBuilder, sep: String): StringBuilder = addString(b, "", sep, "")
-
- /** Appends all elements of this $coll to a string builder.
- * The written text consists of the string representations (w.r.t. the method `toString`)
- * of all elements of this $coll without any separator string.
- *
- * @param b the string builder to which elements are appended.
- * @return the string builder `b` to which elements were appended.
- */
- def addString(b: StringBuilder): StringBuilder = addString(b, "")
+ def toTraversable: Traversable[A] = thisCollection
+ def toIterator: Iterator[A] = toIterable.iterator
/** Converts this $coll to a string
- * @returns a string representation of this collection. By default this
+ * @return a string representation of this collection. By default this
* string consists of the `stringPrefix` of this $coll,
* followed by all elements separated by commas and enclosed in parentheses.
*/
@@ -1131,7 +714,7 @@ self =>
*/
def view = new TraversableView[A, Repr] {
protected lazy val underlying = self.repr
- override def foreach[B](f: A => B) = self foreach f
+ override def foreach[U](f: A => U) = self foreach f
}
/** Creates a non-strict view of a slice of this $coll.
diff --git a/src/library/scala/collection/TraversableOnce.scala b/src/library/scala/collection/TraversableOnce.scala
new file mode 100644
index 0000000000..6e4917b77e
--- /dev/null
+++ b/src/library/scala/collection/TraversableOnce.scala
@@ -0,0 +1,522 @@
+/* __ *\
+** ________ ___ / / ___ Scala API **
+** / __/ __// _ | / / / _ | (c) 2003-2010, LAMP/EPFL **
+** __\ \/ /__/ __ |/ /__/ __ | http://scala-lang.org/ **
+** /____/\___/_/ |_/____/_/ | | **
+** |/ **
+\* */
+
+package scala.collection
+
+import mutable.{ Buffer, ListBuffer, ArrayBuffer }
+
+/** A template trait for collections which can be traversed one
+ * or more times.
+ * $traversableonceinfo
+ *
+ * @tparam A the element type of the collection
+ *
+ * @define traversableonceinfo
+ * This trait is composed of those methods which can be implemented
+ * solely in terms of foreach and which do not need access to a Builder.
+ * It represents the implementations common to Iterators and
+ * Traversables, such as folds, conversions, and other operations which
+ * traverse some or all of the elements and return a derived value.
+ *
+ * @author Martin Odersky
+ * @author Paul Phillips
+ * @version 2.8
+ * @since 2.8
+ *
+ * @define coll traversable or iterator
+ * @define orderDependentFold
+ *
+ * Note: might return different results for different runs, unless the underlying collection type is ordered
+ * or the operator is associative and commutative.
+ * @define willNotTerminateInf
+ *
+ * Note: will not terminate for infinite-sized collections.
+ */
+trait TraversableOnce[+A] {
+ self =>
+
+ /** Self-documenting abstract methods. */
+ def foreach[U](f: A => U): Unit
+ def isEmpty: Boolean
+ def hasDefiniteSize: Boolean
+
+ /** Tests whether this $coll can be repeatedly traversed. Always
+ * true for Traversables and false for Iterators unless overridden.
+ *
+ * @return `true` if it is repeatedly traversable, `false` otherwise.
+ */
+ def isTraversableAgain: Boolean
+
+ /** Returns an Iterator over the elements in this $coll. Will return
+ * the same Iterator if this instance is already an Iterator.
+ * $willNotTerminateInf
+ * @return an Iterator containing all elements of this $coll.
+ */
+ def toIterator: Iterator[A]
+
+ /** Converts this $coll to an unspecified Traversable. Will return
+ * the same collection if this instance is already Traversable.
+ * $willNotTerminateInf
+ * @return a Traversable containing all elements of this $coll.
+ */
+ def toTraversable: Traversable[A]
+
+ /** Presently these are abstract because the Traversable versions use
+ * breakable/break, and I wasn't sure enough of how that's supposed to
+ * function to consolidate them with the Iterator versions.
+ */
+ def forall(p: A => Boolean): Boolean
+ def exists(p: A => Boolean): Boolean
+ def find(p: A => Boolean): Option[A]
+ def copyToArray[B >: A](xs: Array[B], start: Int, len: Int): Unit
+ // def mapFind[B](f: A => Option[B]): Option[B]
+
+ // for internal use
+ protected[this] def reversed = {
+ var elems: List[A] = Nil
+ self foreach (elems ::= _)
+ elems
+ }
+
+ /** The size of this $coll.
+ *
+ * $willNotTerminateInf
+ *
+ * @return the number of elements in this $coll.
+ */
+ def size: Int = {
+ var result = 0
+ for (x <- self) result += 1
+ result
+ }
+
+ /** Tests whether the $coll is not empty.
+ *
+ * @return `true` if the $coll contains at least one element, `false` otherwise.
+ */
+ def nonEmpty: Boolean = !isEmpty
+
+ /** Counts the number of elements in the $coll which satisfy a predicate.
+ *
+ * @param p the predicate used to test elements.
+ * @return the number of elements satisfying the predicate `p`.
+ */
+ def count(p: A => Boolean): Int = {
+ var cnt = 0
+ for (x <- this)
+ if (p(x)) cnt += 1
+
+ cnt
+ }
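A quick sketch of the single-pass counting behavior above, against a current Scala standard library (the demo object name is illustrative):

```scala
object CountDemo {
  def main(args: Array[String]): Unit = {
    // count traverses once, incrementing for each element satisfying p
    assert(List(1, 2, 3, 4, 5).count(_ % 2 == 0) == 2)
    // works for one-shot traversals such as Iterators as well
    assert(Iterator("a", "bb", "ccc").count(_.length > 1) == 2)
  }
}
```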
+
+ /** Applies a binary operator to a start value and all elements of this $coll, going left to right.
+ *
+ * Note: `/:` is alternate syntax for `foldLeft`; `z /: xs` is the same as `xs foldLeft z`.
+ * $willNotTerminateInf
+ * $orderDependentFold
+ *
+ * @param z the start value.
+ * @param op the binary operator.
+ * @tparam B the result type of the binary operator.
+ * @return the result of inserting `op` between consecutive elements of this $coll,
+ * going left to right with the start value `z` on the left:
+ * {{{
+ * op(...op(op(z, x,,1,,), x,,2,,), ..., x,,n,,)
+ * }}}
+ * where `x,,1,,, ..., x,,n,,` are the elements of this $coll.
+ */
+ def /:[B](z: B)(op: (B, A) => B): B = foldLeft(z)(op)
+
+ /** Applies a binary operator to all elements of this $coll and a start value, going right to left.
+ *
+ * Note: `:\` is alternate syntax for `foldRight`; `xs :\ z` is the same as `xs foldRight z`.
+ * $willNotTerminateInf
+ * $orderDependentFold
+ *
+ * @param z the start value
+ * @param op the binary operator
+ * @tparam B the result type of the binary operator.
+ * @return the result of inserting `op` between consecutive elements of this $coll,
+ * going right to left with the start value `z` on the right:
+ * {{{
+ * op(x,,1,,, op(x,,2,,, ... op(x,,n,,, z)...))
+ * }}}
+ * where `x,,1,,, ..., x,,n,,` are the elements of this $coll.
+ */
+ def :\[B](z: B)(op: (A, B) => B): B = foldRight(z)(op)
+
+ /** Applies a binary operator to a start value and all elements of this $coll, going left to right.
+ *
+ * $willNotTerminateInf
+ * $orderDependentFold
+ *
+ * @param z the start value.
+ * @param op the binary operator.
+ * @tparam B the result type of the binary operator.
+ * @return the result of inserting `op` between consecutive elements of this $coll,
+ * going left to right with the start value `z` on the left:
+ * {{{
+ * op(...op(z, x,,1,,), x,,2,,, ..., x,,n,,)
+ * }}}
+ * where `x,,1,,, ..., x,,n,,` are the elements of this $coll.
+ */
+ def foldLeft[B](z: B)(op: (B, A) => B): B = {
+ var result = z
+ this foreach (x => result = op(result, x))
+ result
+ }
+
+ /** Applies a binary operator to all elements of this $coll and a start value, going right to left.
+ *
+ * $willNotTerminateInf
+ * $orderDependentFold
+ * @param z the start value.
+ * @param op the binary operator.
+ * @tparam B the result type of the binary operator.
+ * @return the result of inserting `op` between consecutive elements of this $coll,
+ * going right to left with the start value `z` on the right:
+ * {{{
+ * op(x,,1,,, op(x,,2,,, ... op(x,,n,,, z)...))
+ * }}}
+ * where `x,,1,,, ..., x,,n,,` are the elements of this $coll.
+ */
+ def foldRight[B](z: B)(op: (A, B) => B): B =
+ reversed.foldLeft(z)((x, y) => op(y, x))
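The left/right associativity described in the two fold signatures can be made concrete with a non-commutative operator (a sketch against a current Scala standard library; the demo name is illustrative):

```scala
object FoldDemo {
  def main(args: Array[String]): Unit = {
    // foldLeft associates to the left: ((10 - 1) - 2) - 3 == 4
    assert(List(1, 2, 3).foldLeft(10)(_ - _) == 4)
    // foldRight associates to the right: 1 - (2 - (3 - 0)) == 2
    assert(List(1, 2, 3).foldRight(0)(_ - _) == 2)
  }
}
```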
+
+ /** Applies a binary operator to all elements of this $coll, going left to right.
+ * $willNotTerminateInf
+ * $orderDependentFold
+ *
+ * @param op the binary operator.
+ * @tparam B the result type of the binary operator.
+ * @return the result of inserting `op` between consecutive elements of this $coll,
+ * going left to right:
+ * {{{
+ * op(...(op(x,,1,,, x,,2,,), ... ) , x,,n,,)
+ * }}}
+ * where `x,,1,,, ..., x,,n,,` are the elements of this $coll.
+ * @throws `UnsupportedOperationException` if this $coll is empty.
+ */
+ def reduceLeft[B >: A](op: (B, A) => B): B = {
+ if (isEmpty)
+ throw new UnsupportedOperationException("empty.reduceLeft")
+
+ var first = true
+ var acc: B = 0.asInstanceOf[B]
+
+ for (x <- self) {
+ if (first) {
+ acc = x
+ first = false
+ }
+ else acc = op(acc, x)
+ }
+ acc
+ }
+
+ /** Applies a binary operator to all elements of this $coll, going right to left.
+ * $willNotTerminateInf
+ * $orderDependentFold
+ *
+ * @param op the binary operator.
+ * @tparam B the result type of the binary operator.
+ * @return the result of inserting `op` between consecutive elements of this $coll,
+ * going right to left:
+ * {{{
+ * op(x,,1,,, op(x,,2,,, ..., op(x,,n-1,,, x,,n,,)...))
+ * }}}
+ * where `x,,1,,, ..., x,,n,,` are the elements of this $coll.
+ * @throws `UnsupportedOperationException` if this $coll is empty.
+ */
+ def reduceRight[B >: A](op: (A, B) => B): B = {
+ if (isEmpty)
+ throw new UnsupportedOperationException("empty.reduceRight")
+
+ reversed.reduceLeft[B]((x, y) => op(y, x))
+ }
+
+ /** Optionally applies a binary operator to all elements of this $coll, going left to right.
+ * $willNotTerminateInf
+ * $orderDependentFold
+ *
+ * @param op the binary operator.
+ * @tparam B the result type of the binary operator.
+ * @return an option value containing the result of `reduceLeft(op)` if this $coll is nonempty,
+ * `None` otherwise.
+ */
+ def reduceLeftOption[B >: A](op: (B, A) => B): Option[B] =
+ if (isEmpty) None else Some(reduceLeft(op))
+
+ /** Optionally applies a binary operator to all elements of this $coll, going right to left.
+ * $willNotTerminateInf
+ * $orderDependentFold
+ *
+ * @param op the binary operator.
+ * @tparam B the result type of the binary operator.
+ * @return an option value containing the result of `reduceRight(op)` if this $coll is nonempty,
+ * `None` otherwise.
+ */
+ def reduceRightOption[B >: A](op: (A, B) => B): Option[B] =
+ if (isEmpty) None else Some(reduceRight(op))
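The reduce family above differs from the folds only in taking its seed from the collection itself; a non-commutative operator again makes the associativity visible (sketch against a current Scala standard library, demo name illustrative):

```scala
object ReduceDemo {
  def main(args: Array[String]): Unit = {
    // reduceLeft: ((1 - 2) - 3) - 4 == -8
    assert(List(1, 2, 3, 4).reduceLeft(_ - _) == -8)
    // reduceRight: 1 - (2 - (3 - 4)) == -2
    assert(List(1, 2, 3, 4).reduceRight(_ - _) == -2)
    // The Option variants avoid the UnsupportedOperationException on empty input
    assert(List.empty[Int].reduceLeftOption(_ + _) == None)
  }
}
```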
+
+ /** Sums up the elements of this collection.
+ *
+ * @param num an implicit parameter defining a set of numeric operations
+ * which includes the `+` operator to be used in forming the sum.
+ * @tparam B the result type of the `+` operator.
+ * @return the sum of all elements of this $coll with respect to the `+` operator in `num`.
+ *
+ * @usecase def sum: Int
+ *
+ * @return the sum of all elements in this $coll of numbers of type `Int`.
+ * Instead of `Int`, any other type `T` with an implicit `Numeric[T]` implementation
+ * can be used as element type of the $coll and as result type of `sum`.
+ * Examples of such types are: `Long`, `Float`, `Double`, `BigInt`.
+ *
+ */
+ def sum[B >: A](implicit num: Numeric[B]): B = foldLeft(num.zero)(num.plus)
+
+ /** Multiplies together the elements of this collection.
+ *
+ * @param num an implicit parameter defining a set of numeric operations
+ * which includes the `*` operator to be used in forming the product.
+ * @tparam B the result type of the `*` operator.
+ * @return the product of all elements of this $coll with respect to the `*` operator in `num`.
+ *
+ * @usecase def product: Int
+ *
+ * @return the product of all elements in this $coll of numbers of type `Int`.
+ * Instead of `Int`, any other type `T` with an implicit `Numeric[T]` implementation
+ * can be used as element type of the $coll and as result type of `product`.
+ * Examples of such types are: `Long`, `Float`, `Double`, `BigInt`.
+ */
+ def product[B >: A](implicit num: Numeric[B]): B = foldLeft(num.one)(num.times)
+
+ /** Finds the smallest element.
+ *
+ * @param cmp An ordering to be used for comparing elements.
+ * @tparam B The type over which the ordering is defined.
+ * @return the smallest element of this $coll with respect to the ordering `cmp`.
+ *
+ * @usecase def min: A
+ * @return the smallest element of this $coll
+ */
+ def min[B >: A](implicit cmp: Ordering[B]): A = {
+ if (isEmpty)
+ throw new UnsupportedOperationException("empty.min")
+
+ reduceLeft((x, y) => if (cmp.lteq(x, y)) x else y)
+ }
+
+ /** Finds the largest element.
+ *
+ * @param cmp An ordering to be used for comparing elements.
+ * @tparam B The type over which the ordering is defined.
+ * @return the largest element of this $coll with respect to the ordering `cmp`.
+ *
+ * @usecase def max: A
+ * @return the largest element of this $coll.
+ */
+ def max[B >: A](implicit cmp: Ordering[B]): A = {
+ if (isEmpty)
+ throw new UnsupportedOperationException("empty.max")
+
+ reduceLeft((x, y) => if (cmp.gteq(x, y)) x else y)
+ }
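The four `Numeric`/`Ordering`-driven methods above reduce to short fold/reduce expressions; a sketch of the expected results against a current Scala standard library (demo name illustrative):

```scala
object NumericDemo {
  def main(args: Array[String]): Unit = {
    val xs = List(3, 1, 4, 2)
    assert(xs.sum == 10)      // foldLeft(num.zero)(num.plus)
    assert(xs.product == 24)  // foldLeft(num.one)(num.times)
    // min and max reduce with the implicit Ordering
    assert(xs.min == 1 && xs.max == 4)
  }
}
```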
+
+ /** Copies all elements of this $coll to a buffer.
+ * $willNotTerminateInf
+ * @param dest The buffer to which elements are copied.
+ */
+ def copyToBuffer[B >: A](dest: Buffer[B]): Unit = dest ++= self
+
+ /** Copies elements of this $coll to an array.
+ * Fills the given array `xs` with elements of this $coll, starting at position `start`.
+ * Copying will stop once either the end of the current $coll is reached,
+ * or the end of the array is reached.
+ *
+ * $willNotTerminateInf
+ *
+ * @param xs the array to fill.
+ * @param start the starting index.
+ * @tparam B the type of the elements of the array.
+ *
+ * @usecase def copyToArray(xs: Array[A], start: Int): Unit
+ */
+ def copyToArray[B >: A](xs: Array[B], start: Int): Unit =
+ copyToArray(xs, start, xs.length - start)
+
+ /** Copies elements of this $coll to an array.
+ * Fills the given array `xs` with all elements of this $coll, starting at position `0`.
+ * Copying will stop once either the end of the current $coll is reached,
+ * or the end of the array is reached.
+ *
+ * $willNotTerminateInf
+ *
+ * @param xs the array to fill.
+ * @tparam B the type of the elements of the array.
+ *
+ * @usecase def copyToArray(xs: Array[A]): Unit
+ */
+ def copyToArray[B >: A](xs: Array[B]): Unit =
+ copyToArray(xs, 0, xs.length)
+
+ /** Converts this $coll to an array.
+ * $willNotTerminateInf
+ *
+ * @tparam B the type of the elements of the array. A `ClassManifest` for this type must
+ * be available.
+ * @return an array containing all elements of this $coll.
+ *
+ * @usecase def toArray: Array[A]
+ * @return an array containing all elements of this $coll.
+ * A `ClassManifest` must be available for the element type of this $coll.
+ */
+ def toArray[B >: A : ClassManifest]: Array[B] = {
+ if (isTraversableAgain) {
+ val result = new Array[B](size)
+ copyToArray(result, 0)
+ result
+ }
+ else toStream.toArray
+ }
+
+ /** Converts this $coll to a list.
+ * $willNotTerminateInf
+ * @return a list containing all elements of this $coll.
+ */
+ def toList: List[A] = (new ListBuffer[A] ++= self).toList
+
+ /** Converts this $coll to an iterable collection.
+ * $willNotTerminateInf
+ * @return an `Iterable` containing all elements of this $coll.
+ */
+ def toIterable: Iterable[A] = toStream
+
+ /** Converts this $coll to a sequence.
+ * $willNotTerminateInf
+ * @return a sequence containing all elements of this $coll.
+ */
+ def toSeq: Seq[A] = toList
+
+ /** Converts this $coll to an indexed sequence.
+ * $willNotTerminateInf
+ * @return an indexed sequence containing all elements of this $coll.
+ */
+ def toIndexedSeq[B >: A]: mutable.IndexedSeq[B] = new ArrayBuffer[B] ++= self
+
+ /** Converts this $coll to a stream.
+ * $willNotTerminateInf
+ * @return a stream containing all elements of this $coll.
+ */
+ def toStream: Stream[A] = toList.toStream
+
+ /** Converts this $coll to a set.
+ * $willNotTerminateInf
+ * @return a set containing all elements of this $coll.
+ */
+ def toSet[B >: A]: immutable.Set[B] = immutable.Set() ++ self
+
+ /** Converts this $coll to a map. This method is unavailable unless
+ * the elements are members of Tuple2, each `(K, V)` becoming a key-value
+ * pair in the map. Duplicate keys will be overwritten by later keys:
+ * if this is an unordered collection, which key ends up in the resulting map
+ * is undefined.
+ * $willNotTerminateInf
+ * @return a map containing all elements of this $coll.
+ */
+ def toMap[T, U](implicit ev: A <:< (T, U)): immutable.Map[T, U] = {
+ val b = immutable.Map.newBuilder[T, U]
+ for (x <- self)
+ b += x
+
+ b.result
+ }
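The "later keys overwrite earlier ones" rule documented above is easy to observe on an ordered collection (a sketch against a current Scala standard library; demo name illustrative):

```scala
object ToMapDemo {
  def main(args: Array[String]): Unit = {
    // In a List the traversal order is fixed, so "a" -> 3 wins over "a" -> 1
    val m = List("a" -> 1, "b" -> 2, "a" -> 3).toMap
    assert(m == Map("a" -> 3, "b" -> 2))
  }
}
```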
+
+ /** Displays all elements of this $coll in a string using start, end, and separator strings.
+ *
+ * @param start the starting string.
+ * @param sep the separator string.
+ * @param end the ending string.
+ * @return a string representation of this $coll. The resulting string
+ * begins with the string `start` and ends with the string
+ * `end`. Inside, the string representations (w.r.t. the method `toString`)
+ * of all elements of this $coll are separated by the string `sep`.
+ *
+ * @example `List(1, 2, 3).mkString("(", "; ", ")") = "(1; 2; 3)"`
+ */
+ def mkString(start: String, sep: String, end: String): String =
+ addString(new StringBuilder(), start, sep, end).toString
+
+ /** Displays all elements of this $coll in a string using a separator string.
+ *
+ * @param sep the separator string.
+ * @return a string representation of this $coll. In the resulting string
+ * the string representations (w.r.t. the method `toString`)
+ * of all elements of this $coll are separated by the string `sep`.
+ *
+ * @example `List(1, 2, 3).mkString("|") = "1|2|3"`
+ */
+ def mkString(sep: String): String = mkString("", sep, "")
+
+ /** Displays all elements of this $coll in a string.
+ * @return a string representation of this $coll. In the resulting string
+ * the string representations (w.r.t. the method `toString`)
+ * of all elements of this $coll follow each other without any separator string.
+ */
+ def mkString: String = mkString("")
+
+ /** Appends all elements of this $coll to a string builder using start, end, and separator strings.
+ * The written text begins with the string `start` and ends with the string
+ * `end`. Inside, the string representations (w.r.t. the method `toString`)
+ * of all elements of this $coll are separated by the string `sep`.
+ *
+ * @param b the string builder to which elements are appended.
+ * @param start the starting string.
+ * @param sep the separator string.
+ * @param end the ending string.
+ * @return the string builder `b` to which elements were appended.
+ */
+ def addString(b: StringBuilder, start: String, sep: String, end: String): StringBuilder = {
+ var first = true
+
+ b append start
+ for (x <- self) {
+ if (first) {
+ b append x
+ first = false
+ }
+ else {
+ b append sep
+ b append x
+ }
+ }
+ b append end
+
+ b
+ }
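`mkString` is a thin wrapper over `addString`, which appends into a caller-supplied builder and returns it; a sketch of both against a current Scala standard library (demo name illustrative):

```scala
object MkStringDemo {
  def main(args: Array[String]): Unit = {
    assert(List(1, 2, 3).mkString("(", "; ", ")") == "(1; 2; 3)")
    // addString reuses an existing StringBuilder and returns the same instance
    val b = new StringBuilder("xs = ")
    assert(List(1, 2).addString(b, "[", ",", "]").toString == "xs = [1,2]")
  }
}
```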
+
+ /** Appends all elements of this $coll to a string builder using a separator string.
+ * The written text consists of the string representations (w.r.t. the method `toString`)
+ * of all elements of this $coll, separated by the string `sep`.
+ *
+ * @param b the string builder to which elements are appended.
+ * @param sep the separator string.
+ * @return the string builder `b` to which elements were appended.
+ */
+ def addString(b: StringBuilder, sep: String): StringBuilder = addString(b, "", sep, "")
+
+ /** Appends all elements of this $coll to a string builder.
+ * The written text consists of the string representations (w.r.t. the method `toString`)
+ * of all elements of this $coll without any separator string.
+ *
+ * @param b the string builder to which elements are appended.
+ * @return the string builder `b` to which elements were appended.
+ */
+ def addString(b: StringBuilder): StringBuilder = addString(b, "")
+}
diff --git a/src/library/scala/collection/TraversableProxy.scala b/src/library/scala/collection/TraversableProxy.scala
index 4a14937781..dd450dccac 100644
--- a/src/library/scala/collection/TraversableProxy.scala
+++ b/src/library/scala/collection/TraversableProxy.scala
@@ -11,7 +11,7 @@
package scala.collection
-// Methods could be printed by cat TraversibeLike.scala | egrep '^ (override )?def'
+// Methods could be printed by cat TraversableLike.scala | egrep '^ (override )?def'
/** This trait implements a proxy for traversable objects. It forwards
diff --git a/src/library/scala/collection/TraversableProxyLike.scala b/src/library/scala/collection/TraversableProxyLike.scala
index 24d6c7048d..fb8da98a6b 100644
--- a/src/library/scala/collection/TraversableProxyLike.scala
+++ b/src/library/scala/collection/TraversableProxyLike.scala
@@ -24,23 +24,22 @@ import mutable.{Buffer, StringBuilder}
* @version 2.8
* @since 2.8
*/
-trait TraversableProxyLike[+A, +This <: TraversableLike[A, This] with Traversable[A]] extends TraversableLike[A, This] with Proxy {
- def self: This
+trait TraversableProxyLike[+A, +Repr <: TraversableLike[A, Repr] with Traversable[A]] extends TraversableLike[A, Repr] with Proxy {
+ def self: Repr
override def foreach[B](f: A => B): Unit = self.foreach(f)
override def isEmpty: Boolean = self.isEmpty
override def nonEmpty: Boolean = self.nonEmpty
override def size: Int = self.size
override def hasDefiniteSize = self.hasDefiniteSize
- override def ++[B >: A, That](that: Traversable[B])(implicit bf: CanBuildFrom[This, B, That]): That = self.++(that)(bf)
- override def ++[B >: A, That](that: Iterator[B])(implicit bf: CanBuildFrom[This, B, That]): That = self.++(that)(bf)
- override def map[B, That](f: A => B)(implicit bf: CanBuildFrom[This, B, That]): That = self.map(f)(bf)
- override def flatMap[B, That](f: A => Traversable[B])(implicit bf: CanBuildFrom[This, B, That]): That = self.flatMap(f)(bf)
- override def partialMap[B, That](pf: PartialFunction[A, B])(implicit bf: CanBuildFrom[This, B, That]): That = self.partialMap(pf)(bf)
- override def filter(p: A => Boolean): This = self.filter(p)
- override def filterNot(p: A => Boolean): This = self.filterNot(p)
- override def partition(p: A => Boolean): (This, This) = self.partition(p)
- override def groupBy[K](f: A => K): Map[K, This] = self.groupBy(f)
+ override def ++[B >: A, That](xs: TraversableOnce[B])(implicit bf: CanBuildFrom[Repr, B, That]): That = self.++(xs)(bf)
+ override def map[B, That](f: A => B)(implicit bf: CanBuildFrom[Repr, B, That]): That = self.map(f)(bf)
+ override def flatMap[B, That](f: A => Traversable[B])(implicit bf: CanBuildFrom[Repr, B, That]): That = self.flatMap(f)(bf)
+ override def filter(p: A => Boolean): Repr = self.filter(p)
+ override def filterNot(p: A => Boolean): Repr = self.filterNot(p)
+ override def collect[B, That](pf: PartialFunction[A, B])(implicit bf: CanBuildFrom[Repr, B, That]): That = self.collect(pf)(bf)
+ override def partition(p: A => Boolean): (Repr, Repr) = self.partition(p)
+ override def groupBy[K](f: A => K): Map[K, Repr] = self.groupBy(f)
override def forall(p: A => Boolean): Boolean = self.forall(p)
override def exists(p: A => Boolean): Boolean = self.exists(p)
override def count(p: A => Boolean): Int = self.count(p)
@@ -53,28 +52,37 @@ trait TraversableProxyLike[+A, +This <: TraversableLike[A, This] with Traversabl
override def reduceLeftOption[B >: A](op: (B, A) => B): Option[B] = self.reduceLeftOption(op)
override def reduceRight[B >: A](op: (A, B) => B): B = self.reduceRight(op)
override def reduceRightOption[B >: A](op: (A, B) => B): Option[B] = self.reduceRightOption(op)
+ override def scanLeft[B, That](z: B)(op: (B, A) => B)(implicit bf: CanBuildFrom[Repr, B, That]): That = self.scanLeft(z)(op)(bf)
+ override def scanRight[B, That](z: B)(op: (A, B) => B)(implicit bf: CanBuildFrom[Repr, B, That]): That = self.scanRight(z)(op)(bf)
+ override def sum[B >: A](implicit num: Numeric[B]): B = self.sum(num)
+ override def product[B >: A](implicit num: Numeric[B]): B = self.product(num)
+ override def min[B >: A](implicit cmp: Ordering[B]): A = self.min(cmp)
+ override def max[B >: A](implicit cmp: Ordering[B]): A = self.max(cmp)
override def head: A = self.head
override def headOption: Option[A] = self.headOption
- override def tail: This = self.tail
+ override def tail: Repr = self.tail
override def last: A = self.last
override def lastOption: Option[A] = self.lastOption
- override def init: This = self.init
- override def take(n: Int): This = self.take(n)
- override def drop(n: Int): This = self.drop(n)
- override def slice(from: Int, until: Int): This = self.slice(from, until)
- override def takeWhile(p: A => Boolean): This = self.takeWhile(p)
- override def dropWhile(p: A => Boolean): This = self.dropWhile(p)
- override def span(p: A => Boolean): (This, This) = self.span(p)
- override def splitAt(n: Int): (This, This) = self.splitAt(n)
+ override def init: Repr = self.init
+ override def take(n: Int): Repr = self.take(n)
+ override def drop(n: Int): Repr = self.drop(n)
+ override def slice(from: Int, until: Int): Repr = self.slice(from, until)
+ override def takeWhile(p: A => Boolean): Repr = self.takeWhile(p)
+ override def dropWhile(p: A => Boolean): Repr = self.dropWhile(p)
+ override def span(p: A => Boolean): (Repr, Repr) = self.span(p)
+ override def splitAt(n: Int): (Repr, Repr) = self.splitAt(n)
override def copyToBuffer[B >: A](dest: Buffer[B]) = self.copyToBuffer(dest)
override def copyToArray[B >: A](xs: Array[B], start: Int, len: Int) = self.copyToArray(xs, start, len)
override def copyToArray[B >: A](xs: Array[B], start: Int) = self.copyToArray(xs, start)
+ override def copyToArray[B >: A](xs: Array[B]) = self.copyToArray(xs)
override def toArray[B >: A: ClassManifest]: Array[B] = self.toArray
override def toList: List[A] = self.toList
override def toIterable: Iterable[A] = self.toIterable
override def toSeq: Seq[A] = self.toSeq
+ override def toIndexedSeq[B >: A]: mutable.IndexedSeq[B] = self.toIndexedSeq
override def toStream: Stream[A] = self.toStream
override def toSet[B >: A]: immutable.Set[B] = self.toSet
+ override def toMap[T, U](implicit ev: A <:< (T, U)): immutable.Map[T, U] = self.toMap(ev)
override def mkString(start: String, sep: String, end: String): String = self.mkString(start, sep, end)
override def mkString(sep: String): String = self.mkString(sep)
override def mkString: String = self.mkString
@@ -83,14 +91,18 @@ trait TraversableProxyLike[+A, +This <: TraversableLike[A, This] with Traversabl
override def addString(b: StringBuilder): StringBuilder = self.addString(b)
override def stringPrefix : String = self.stringPrefix
override def view = self.view
- override def view(from: Int, until: Int): TraversableView[A, This] = self.view(from, until)
+ override def view(from: Int, until: Int): TraversableView[A, Repr] = self.view(from, until)
}
-private class TraversableProxyLikeConfirmation[+A, +This <: TraversableLike[A, This] with Traversable[A]]
+/** Martin to Paul: I'm not sure what the purpose of this class is. I assume it was to make
+ * sure that TraversableProxyLike has all Traversable methods, but it fails at that.
+ *
+private class TraversableProxyLikeConfirmation[+A, +Repr <: TraversableLike[A, Repr] with Traversable[A]]
extends TraversableProxyLike[A, Traversable[A]]
with interfaces.TraversableMethods[A, Traversable[A]]
{
- def self: This = repr.asInstanceOf[This]
+ def self: Repr = repr.asInstanceOf[Repr]
protected[this] def newBuilder = scala.collection.Traversable.newBuilder[A]
- // : Builder[A, This]
+ // : Builder[A, Repr]
}
+*/
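Review note: the proxy trait above forwards every collection operation to an underlying `self`. A minimal standalone sketch of that delegation pattern (the `CountingSeq` class and its method set are illustrative, not part of the patch):

```scala
// Delegation in the spirit of TraversableProxyLike: each operation
// forwards to the wrapped collection `self`, so the wrapper can add
// behavior (here: call counting) without reimplementing the logic.
class CountingSeq[A](self: Seq[A]) {
  var calls = 0                                      // forwarded-call counter
  private def fwd[T](body: => T): T = { calls += 1; body }
  def filter(p: A => Boolean): Seq[A] = fwd(self.filter(p))
  def partition(p: A => Boolean): (Seq[A], Seq[A]) = fwd(self.partition(p))
  def groupBy[K](f: A => K): Map[K, Seq[A]] = fwd(self.groupBy(f))
}

object ProxyDemo extends App {
  val xs = new CountingSeq(Seq(1, 2, 3, 4))
  assert(xs.filter(_ % 2 == 0) == Seq(2, 4))
  assert(xs.partition(_ < 3) == (Seq(1, 2), Seq(3, 4)))
  assert(xs.calls == 2)
  println("proxy forwarded " + xs.calls + " calls")
}
```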
diff --git a/src/library/scala/collection/TraversableView.scala b/src/library/scala/collection/TraversableView.scala
index 8a67b8d10f..e9332097e7 100644
--- a/src/library/scala/collection/TraversableView.scala
+++ b/src/library/scala/collection/TraversableView.scala
@@ -18,7 +18,7 @@ import TraversableView.NoBuilder
/** <p>
* A base class for views of <a href="../Traversable.html"
* target="ContentFrame"><code>Traversable<code></a>.<br/>
- * Every subclass has to implenment the <code>foreach</code> method.
+ * Every subclass has to implement the <code>foreach</code> method.
* </p>
*
* @author Martin Odersky
diff --git a/src/library/scala/collection/TraversableViewLike.scala b/src/library/scala/collection/TraversableViewLike.scala
index 84c33296db..09e6a65158 100644
--- a/src/library/scala/collection/TraversableViewLike.scala
+++ b/src/library/scala/collection/TraversableViewLike.scala
@@ -12,7 +12,7 @@
package scala.collection
import generic._
-import mutable.Builder
+import mutable.{Builder, ArrayBuffer}
import TraversableView.NoBuilder
/** <p>
@@ -47,8 +47,23 @@ self =>
b.result()
}
+ /** The implementation base trait of this view.
+ * This trait and all its subtraits have to be re-implemented for each
+ * ViewLike class.
+ */
trait Transformed[+B] extends TraversableView[B, Coll] {
lazy val underlying = self.underlying
+ override def toString = stringPrefix+"(...)"
+ }
+
+ /** A fallback which forces everything into a strict sequence and then applies an operation
+ * on it. Used for those operations which do not naturally lend themselves to a view.
+ */
+ trait Forced[B] extends Transformed[B] {
+ protected[this] def forced: Seq[B]
+ private[this] lazy val forcedCache = forced
+ override def foreach[U](f: B => U) = forcedCache.foreach(f)
+ override def stringPrefix = self.stringPrefix+"C"
}
/** pre: from >= 0
@@ -56,7 +71,7 @@ self =>
trait Sliced extends Transformed[A] {
protected[this] val from: Int
protected[this] val until: Int
- override def foreach[C](f: A => C) {
+ override def foreach[U](f: A => U) {
var index = 0
for (x <- self) {
if (from <= index) {
@@ -73,7 +88,7 @@ self =>
trait Mapped[B] extends Transformed[B] {
protected[this] val mapping: A => B
- override def foreach[C](f: B => C) {
+ override def foreach[U](f: B => U) {
for (x <- self)
f(mapping(x))
}
@@ -82,7 +97,7 @@ self =>
trait FlatMapped[B] extends Transformed[B] {
protected[this] val mapping: A => Traversable[B]
- override def foreach[C](f: B => C) {
+ override def foreach[U](f: B => U) {
for (x <- self)
for (y <- mapping(x))
f(y)
@@ -92,7 +107,7 @@ self =>
trait Appended[B >: A] extends Transformed[B] {
protected[this] val rest: Traversable[B]
- override def foreach[C](f: B => C) {
+ override def foreach[U](f: B => U) {
for (x <- self) f(x)
for (x <- rest) f(x)
}
@@ -101,7 +116,7 @@ self =>
trait Filtered extends Transformed[A] {
protected[this] val pred: A => Boolean
- override def foreach[C](f: A => C) {
+ override def foreach[U](f: A => U) {
for (x <- self)
if (pred(x)) f(x)
}
@@ -110,7 +125,7 @@ self =>
trait TakenWhile extends Transformed[A] {
protected[this] val pred: A => Boolean
- override def foreach[C](f: A => C) {
+ override def foreach[U](f: A => U) {
for (x <- self) {
if (!pred(x)) return
f(x)
@@ -121,7 +136,7 @@ self =>
trait DroppedWhile extends Transformed[A] {
protected[this] val pred: A => Boolean
- override def foreach[C](f: A => C) {
+ override def foreach[U](f: A => U) {
var go = false
for (x <- self) {
if (!go && !pred(x)) go = true
@@ -134,6 +149,7 @@ self =>
/** Boilerplate method, to override in each subclass
* This method could be eliminated if Scala had virtual classes
*/
+ protected def newForced[B](xs: => Seq[B]): Transformed[B] = new Forced[B] { val forced = xs }
protected def newAppended[B >: A](that: Traversable[B]): Transformed[B] = new Appended[B] { val rest = that }
protected def newMapped[B](f: A => B): Transformed[B] = new Mapped[B] { val mapping = f }
protected def newFlatMapped[B](f: A => Traversable[B]): Transformed[B] = new FlatMapped[B] { val mapping = f }
@@ -142,14 +158,12 @@ self =>
protected def newDroppedWhile(p: A => Boolean): Transformed[A] = new DroppedWhile { val pred = p }
protected def newTakenWhile(p: A => Boolean): Transformed[A] = new TakenWhile { val pred = p }
- override def ++[B >: A, That](that: Traversable[B])(implicit bf: CanBuildFrom[This, B, That]): That = {
- newAppended(that).asInstanceOf[That]
+ override def ++[B >: A, That](xs: TraversableOnce[B])(implicit bf: CanBuildFrom[This, B, That]): That = {
+ newAppended(xs.toTraversable).asInstanceOf[That]
// was: if (bf.isInstanceOf[ByPassCanBuildFrom]) newAppended(that).asInstanceOf[That]
// else super.++[B, That](that)(bf)
}
- override def ++[B >: A, That](that: Iterator[B])(implicit bf: CanBuildFrom[This, B, That]): That = ++[B, That](that.toStream)
-
override def map[B, That](f: A => B)(implicit bf: CanBuildFrom[This, B, That]): That = {
newMapped(f).asInstanceOf[That]
// val b = bf(repr)
@@ -157,6 +171,9 @@ self =>
// else super.map[B, That](f)(bf)
}
+ override def collect[B, That](pf: PartialFunction[A, B])(implicit bf: CanBuildFrom[This, B, That]): That =
+ filter(pf.isDefinedAt).map(pf)(bf)
+
override def flatMap[B, That](f: A => Traversable[B])(implicit bf: CanBuildFrom[This, B, That]): That = {
newFlatMapped(f).asInstanceOf[That]
// was: val b = bf(repr)
@@ -164,7 +181,14 @@ self =>
// else super.flatMap[B, That](f)(bf)
}
+ protected[this] def thisSeq: Seq[A] = {
+ val buf = new ArrayBuffer[A]
+ self foreach (buf +=)
+ buf.result
+ }
+
override def filter(p: A => Boolean): This = newFiltered(p).asInstanceOf[This]
+ override def partition(p: A => Boolean): (This, This) = (filter(p), filter(!p(_)))
override def init: This = newSliced(0, size - 1).asInstanceOf[This]
override def drop(n: Int): This = newSliced(n max 0, Int.MaxValue).asInstanceOf[This]
override def take(n: Int): This = newSliced(0, n).asInstanceOf[This]
@@ -173,5 +197,17 @@ self =>
override def takeWhile(p: A => Boolean): This = newTakenWhile(p).asInstanceOf[This]
override def span(p: A => Boolean): (This, This) = (takeWhile(p), dropWhile(p))
override def splitAt(n: Int): (This, This) = (take(n), drop(n))
+
+ override def scanLeft[B, That](z: B)(op: (B, A) => B)(implicit bf: CanBuildFrom[This, B, That]): That =
+ newForced(thisSeq.scanLeft(z)(op)).asInstanceOf[That]
+
+ override def scanRight[B, That](z: B)(op: (A, B) => B)(implicit bf: CanBuildFrom[This, B, That]): That =
+ newForced(thisSeq.scanRight(z)(op)).asInstanceOf[That]
+
+ override def groupBy[K](f: A => K): Map[K, This] =
+ thisSeq.groupBy(f).mapValues(xs => newForced(xs).asInstanceOf[This])
+
override def stringPrefix = "TraversableView"
}
+
+
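Review note: the new `Forced` trait (and `newForced`) exists because operations like `scanLeft`, `scanRight`, and `groupBy` have no per-element lazy rule, so the view's elements are evaluated into a strict `Seq` first. A small sketch of why, using only plain collection operations (not the patched internals):

```scala
// Each scanLeft element depends on the previous one, so there is no
// element-wise lazy representation: the view must be forced, then scanned,
// which is exactly what newForced(thisSeq.scanLeft(z)(op)) does.
object ForcedDemo extends App {
  val xs = List(1, 2, 3, 4)
  val mapped = xs.view.map(_ * 2)                   // stays lazy
  val scanned = mapped.toList.scanLeft(0)(_ + _)    // force, then scan
  assert(scanned == List(0, 2, 6, 12, 20))
  println(scanned.mkString(", "))
}
```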
diff --git a/src/library/scala/collection/generic/Addable.scala b/src/library/scala/collection/generic/Addable.scala
index 9686e96c09..ecbd8301b6 100644
--- a/src/library/scala/collection/generic/Addable.scala
+++ b/src/library/scala/collection/generic/Addable.scala
@@ -52,16 +52,5 @@ trait Addable[A, +Repr <: Addable[A, Repr]] { self =>
* @param elems the collection containing the added elements.
* @return a new $coll with the given elements added.
*/
- def ++ (elems: Traversable[A]): Repr = (repr /: elems) (_ + _)
-
- /** Creates a new $coll by adding all elements produced by an iterator to this $coll.
- *
- * @param iter the iterator producing the added elements.
- * @return a new $coll with the given elements added.
- */
- def ++ (iter: Iterator[A]): Repr = (repr /: iter) (_ + _)
+ def ++ (xs: TraversableOnce[A]): Repr = (repr /: xs) (_ + _)
}
-
-
-
-
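Review note: collapsing the two `++` overloads (`Traversable[A]` and `Iterator[A]`) into a single `++(xs: TraversableOnce[A])` means one method serves both kinds of sources. A quick check with an immutable `Set` (the same behavior holds under `IterableOnce` in later Scala versions):

```scala
object AddableDemo extends App {
  val s = Set(1, 2)
  // One ++ overload covers strict collections and iterators alike.
  assert((s ++ List(2, 3)) == Set(1, 2, 3))
  assert((s ++ Iterator(3, 4)) == Set(1, 2, 3, 4))
  println("both source kinds accepted by ++")
}
```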
diff --git a/src/library/scala/collection/generic/GenericTraversableTemplate.scala b/src/library/scala/collection/generic/GenericTraversableTemplate.scala
index 0e3c3c203b..683f609686 100644
--- a/src/library/scala/collection/generic/GenericTraversableTemplate.scala
+++ b/src/library/scala/collection/generic/GenericTraversableTemplate.scala
@@ -100,8 +100,8 @@ trait GenericTraversableTemplate[+A, +CC[X] <: Traversable[X]] extends HasNewBui
}
/** Transposes this $coll of traversable collections into
- * @B the type of the elements of each traversable collection.
- * @asTraversable an implicit conversion which asserts that the element type of this
+ * @tparam B the type of the elements of each traversable collection.
+ * @param asTraversable an implicit conversion which asserts that the element type of this
* $coll is a `Traversable`.
* @return a two-dimensional $coll of ${coll}s which has as ''n''th row
* the ''n''th column of this $coll.
diff --git a/src/library/scala/collection/generic/Growable.scala b/src/library/scala/collection/generic/Growable.scala
index 7950dee9de..80f933a901 100644
--- a/src/library/scala/collection/generic/Growable.scala
+++ b/src/library/scala/collection/generic/Growable.scala
@@ -16,7 +16,6 @@ package generic
* a `clear` method.
*
* @author Martin Odersky
- * @owner Martin Odersky
* @version 2.8
* @since 2.8
* @define coll growable collection
@@ -42,26 +41,15 @@ trait Growable[-A] {
*/
def +=(elem1: A, elem2: A, elems: A*): this.type = this += elem1 += elem2 ++= elems
- /** ${Add}s all elements produced by an iterator to this $coll.
+ /** ${Add}s all elements produced by a TraversableOnce to this $coll.
*
- * @param iter the iterator producing the elements to $add.
+ * @param xs the TraversableOnce producing the elements to $add.
* @return the $coll itself.
*/
- def ++=(iter: Iterator[A]): this.type = { iter foreach += ; this }
-
- /** ${Add}s all elements contained in a traversable collection to this $coll.
- *
- * @param elems the collection containing the elements to $add.
- * @return the $coll itself.
- */
- def ++=(elems: Traversable[A]): this.type = { elems foreach +=; this }
+ def ++=(xs: TraversableOnce[A]): this.type = { xs foreach += ; this }
/** Clears the $coll's contents. After this operation, the
* $coll is empty.
*/
def clear()
}
-
-
-
-
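Review note: `Growable` gets the same unification as `Addable`: the two `++=` overloads become one `++=(xs: TraversableOnce[A])`. Sketch with a mutable buffer:

```scala
import scala.collection.mutable.ArrayBuffer

object GrowableDemo extends App {
  val buf = ArrayBuffer(1, 2)
  buf ++= List(3, 4)     // a strict collection source
  buf ++= Iterator(5)    // an iterator source: same ++= overload
  assert(buf.toList == List(1, 2, 3, 4, 5))
  println(buf.mkString(", "))
}
```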
diff --git a/src/library/scala/collection/generic/IterableForwarder.scala b/src/library/scala/collection/generic/IterableForwarder.scala
index 9cd1ccd843..f4aef2fcbb 100644
--- a/src/library/scala/collection/generic/IterableForwarder.scala
+++ b/src/library/scala/collection/generic/IterableForwarder.scala
@@ -22,7 +22,7 @@ import collection.mutable.Buffer
* <li><code>toString</code>, <code>hashCode</code>, <code>equals</code>,
* <code>stringPrefix</code></li>
* <li><code>newBuilder</code>, <code>view</code></li>
- * <li>all calls creating a new iterable objetc of the same kind</li>
+ * <li>all calls creating a new iterable object of the same kind</li>
* </ul>
* <p>
* The above methods are forwarded by subclass <a href="../IterableProxy.html"
@@ -41,6 +41,6 @@ trait IterableForwarder[+A] extends Iterable[A] with TraversableForwarder[A] {
// Iterable delegates
// Iterable methods could be printed by cat IterableLike.scala | sed -n '/trait Iterable/,$ p' | egrep '^ (override )?def'
- override def iterator = underlying.iterator
+ override def iterator: Iterator[A] = underlying.iterator
override def sameElements[B >: A](that: Iterable[B]): Boolean = underlying.sameElements(that)
}
diff --git a/src/library/scala/collection/generic/SeqForwarder.scala b/src/library/scala/collection/generic/SeqForwarder.scala
index 0ecdaf4566..e5dbe4b79d 100644
--- a/src/library/scala/collection/generic/SeqForwarder.scala
+++ b/src/library/scala/collection/generic/SeqForwarder.scala
@@ -30,24 +30,31 @@ trait SeqForwarder[+A] extends Seq[A] with IterableForwarder[A] {
protected override def underlying: Seq[A]
- // PartialFunction delegates
-
- override def apply(i: Int): A = underlying.apply(i)
- override def isDefinedAt(x: Int): Boolean = underlying.isDefinedAt(x)
-
- // Seq delegates
- // Seq methods could be printed by cat SeqLike.scala | sed -n '/trait Seq/,$ p' | egrep '^ (override )?def'
-
override def length: Int = underlying.length
- override def lengthCompare(l: Int) = underlying lengthCompare l
+ override def apply(idx: Int): A = underlying.apply(idx)
+ override def lengthCompare(len: Int): Int = underlying.lengthCompare(len)
+ override def isDefinedAt(x: Int): Boolean = underlying.isDefinedAt(x)
override def segmentLength(p: A => Boolean, from: Int): Int = underlying.segmentLength(p, from)
override def prefixLength(p: A => Boolean) = underlying.prefixLength(p)
+ override def indexWhere(p: A => Boolean): Int = underlying.indexWhere(p)
override def indexWhere(p: A => Boolean, from: Int): Int = underlying.indexWhere(p, from)
+ override def findIndexOf(p: A => Boolean): Int = underlying.indexWhere(p)
+ override def indexOf[B >: A](elem: B): Int = underlying.indexOf(elem)
override def indexOf[B >: A](elem: B, from: Int): Int = underlying.indexOf(elem, from)
+ override def lastIndexOf[B >: A](elem: B): Int = underlying.lastIndexOf(elem)
+ override def lastIndexOf[B >: A](elem: B, end: Int): Int = underlying.lastIndexOf(elem, end)
+ override def lastIndexWhere(p: A => Boolean): Int = underlying.lastIndexWhere(p)
+ override def lastIndexWhere(p: A => Boolean, end: Int): Int = underlying.lastIndexWhere(p, end)
override def reverseIterator: Iterator[A] = underlying.reverseIterator
override def startsWith[B](that: Seq[B], offset: Int): Boolean = underlying.startsWith(that, offset)
+ override def startsWith[B](that: Seq[B]): Boolean = underlying.startsWith(that)
override def endsWith[B](that: Seq[B]): Boolean = underlying.endsWith(that)
override def indexOfSlice[B >: A](that: Seq[B]): Int = underlying.indexOfSlice(that)
+ override def indexOfSlice[B >: A](that: Seq[B], from: Int): Int = underlying.indexOfSlice(that, from)
+ override def lastIndexOfSlice[B >: A](that: Seq[B]): Int = underlying.lastIndexOfSlice(that)
+ override def lastIndexOfSlice[B >: A](that: Seq[B], end: Int): Int = underlying.lastIndexOfSlice(that, end)
+ override def containsSlice[B](that: Seq[B]): Boolean = underlying.containsSlice(that)
override def contains(elem: Any): Boolean = underlying.contains(elem)
+ override def corresponds[B](that: Seq[B])(p: (A,B) => Boolean): Boolean = underlying.corresponds(that)(p)
override def indices: Range = underlying.indices
}
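Review note: `SeqForwarder` now forwards the full set of index/search methods (`indexWhere`, `lastIndexOf`, `segmentLength`, `indexOfSlice`, ...) rather than a partial subset. Their semantics, shown on a plain `Vector` (the forwarder just delegates to these):

```scala
object SeqSearchDemo extends App {
  val xs = Vector(3, 1, 4, 1, 5)
  assert(xs.indexWhere(_ > 3) == 2)        // first element > 3 is 4, at index 2
  assert(xs.lastIndexOf(1) == 3)           // last occurrence of 1
  assert(xs.segmentLength(_ < 5, 0) == 4)  // prefix length of elements < 5
  assert(xs.indexOfSlice(Seq(4, 1)) == 2)  // subsequence search
  println("search methods behave as delegated")
}
```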
diff --git a/src/library/scala/collection/generic/Shrinkable.scala b/src/library/scala/collection/generic/Shrinkable.scala
index bd773a97f9..cf970e1232 100644
--- a/src/library/scala/collection/generic/Shrinkable.scala
+++ b/src/library/scala/collection/generic/Shrinkable.scala
@@ -15,7 +15,6 @@ package generic
* using a `-=` operator.
*
* @author Martin Odersky
- * @owner Martin Odersky
* @version 2.8
* @since 2.8
* @define coll shrinkable collection
@@ -48,14 +47,7 @@ trait Shrinkable[-A] {
* @param iter the iterator producing the elements to remove.
* @return the $coll itself
*/
- def --=(iter: Iterator[A]): this.type = { iter foreach -=; this }
-
- /** Removes all elements contained in a traversable collection from this $coll.
- *
- * @param iter the collection containing the elements to remove.
- * @return the $coll itself
- */
- def --=(iter: Traversable[A]): this.type = { iter foreach -=; this }
+ def --=(xs: TraversableOnce[A]): this.type = { xs foreach -= ; this }
}
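Review note: `Shrinkable` mirrors `Growable`, with the two `--=` overloads merged into `--=(xs: TraversableOnce[A])`. Sketch with a mutable set:

```scala
import scala.collection.mutable

object ShrinkableDemo extends App {
  val s = mutable.Set(1, 2, 3, 4)
  s --= List(1, 2)     // collection source
  s --= Iterator(4)    // iterator source: same --= overload
  assert(s == mutable.Set(3))
  println("remaining: " + s.mkString(", "))
}
```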
diff --git a/src/library/scala/collection/generic/Sorted.scala b/src/library/scala/collection/generic/Sorted.scala
index 73bc1b6553..aa95c76a88 100644
--- a/src/library/scala/collection/generic/Sorted.scala
+++ b/src/library/scala/collection/generic/Sorted.scala
@@ -16,8 +16,8 @@ package generic
* @author Sean McDirmid
* @since 2.8
*/
-trait Sorted[K, +This <: Sorted[K, This]]{
- def ordering : Ordering[K];
+trait Sorted[K, +This <: Sorted[K, This]] {
+ def ordering : Ordering[K]
/** The current collection */
protected def repr: This
@@ -25,7 +25,6 @@ trait Sorted[K, +This <: Sorted[K, This]]{
/** return as a projection the set of keys in this collection */
def keySet: SortedSet[K]
-
/** Returns the first key of the collection. */
def firstKey: K
@@ -68,24 +67,25 @@ trait Sorted[K, +This <: Sorted[K, This]]{
*/
def range(from: K, until: K): This = rangeImpl(Some(from), Some(until))
-
/** Create a range projection of this collection with no lower-bound.
* @param to The upper-bound (inclusive) of the ranged projection.
*/
def to(to: K): This = {
// tough!
- val i = keySet.from(to).iterator;
- if (!i.hasNext) return repr
- val next = i.next;
- if (next == to) {
- if (!i.hasNext) return repr
- else return until(i.next)
- } else return until(next)
+ val i = keySet.from(to).iterator
+ if (i.isEmpty) return repr
+ val next = i.next
+ if (next == to)
+ if (i.isEmpty) repr
+ else until(i.next)
+ else
+ until(next)
}
protected def hasAll(j: Iterator[K]): Boolean = {
- val i = keySet.iterator;
- if (!i.hasNext) return !j.hasNext;
+ val i = keySet.iterator
+ if (i.isEmpty) return j.isEmpty
+
var in = i.next;
while (j.hasNext) {
val jn = j.next;
@@ -99,5 +99,4 @@ trait Sorted[K, +This <: Sorted[K, This]]{
}
true
}
-
}
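Review note: the cleanup in `Sorted` (dropping semicolons, rewriting `to` without `return` chains) leaves the range-projection semantics unchanged: `range(from, until)` is `rangeImpl(Some(from), Some(until))`, inclusive below and exclusive above. Checked on a `SortedSet`:

```scala
import scala.collection.immutable.SortedSet

object SortedRangeDemo extends App {
  val s = SortedSet(1, 3, 5, 7, 9)
  // Lower bound inclusive, upper bound exclusive.
  assert(s.range(3, 8) == SortedSet(3, 5, 7))
  assert(s.range(4, 4).isEmpty)
  println(s.range(3, 8).mkString(", "))
}
```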
diff --git a/src/library/scala/collection/generic/Subtractable.scala b/src/library/scala/collection/generic/Subtractable.scala
index 8ded9c22d1..b2051d2773 100644
--- a/src/library/scala/collection/generic/Subtractable.scala
+++ b/src/library/scala/collection/generic/Subtractable.scala
@@ -56,14 +56,5 @@ trait Subtractable[A, +Repr <: Subtractable[A, Repr]] { self =>
* @return a new $coll that contains all elements of the current $coll
* except one less occurrence of each of the elements of `elems`.
*/
- def --(elems: Traversable[A]): Repr = (repr /: elems) (_ - _)
-
- /** Creates a new $coll from this $coll by removing all elements produced
- * by an iterator.
- *
- * @param iter the iterator producing the removed elements.
- * @return a new $coll that contains all elements of the current $coll
- * except one less occurrence of each of the elements produced by `iter`.
- */
- def --(iter: Iterator[A]): Repr = (repr /: iter) (_ - _)
+ def --(xs: TraversableOnce[A]): Repr = (repr /: xs) (_ - _)
}
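Review note: `Subtractable` completes the `TraversableOnce` unification for the non-destructive side: one `--(xs: TraversableOnce[A])` folds `-` over any element source. Sketch using `Map`, whose `--` removes by key:

```scala
object SubtractableDemo extends App {
  val m = Map("a" -> 1, "b" -> 2, "c" -> 3)
  // One -- overload for any element source:
  assert((m -- List("a")) == Map("b" -> 2, "c" -> 3))
  assert((m -- Iterator("b", "c")) == Map("a" -> 1))
  println("-- accepts collections and iterators alike")
}
```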
diff --git a/src/library/scala/collection/generic/TraversableFactory.scala b/src/library/scala/collection/generic/TraversableFactory.scala
index 4f2eb40a64..c2668b48a2 100644
--- a/src/library/scala/collection/generic/TraversableFactory.scala
+++ b/src/library/scala/collection/generic/TraversableFactory.scala
@@ -38,7 +38,7 @@ abstract class TraversableFactory[CC[X] <: Traversable[X] with GenericTraversabl
extends GenericCompanion[CC] {
/** A generic implementation of the `CanBuildFrom` trait, which forwards
- * all calls to `apply(from)` to the `genericBuilder` methof of
+ * all calls to `apply(from)` to the `genericBuilder` method of
* $coll `from`, and which forwards all calls of `apply()` to the
* `newBuilder` method of this factory.
*/
diff --git a/src/library/scala/collection/generic/TraversableForwarder.scala b/src/library/scala/collection/generic/TraversableForwarder.scala
index bd7f751288..dcba86f3d7 100644
--- a/src/library/scala/collection/generic/TraversableForwarder.scala
+++ b/src/library/scala/collection/generic/TraversableForwarder.scala
@@ -42,33 +42,47 @@ trait TraversableForwarder[+A] extends Traversable[A] {
/** The iterable object to which calls are forwarded */
protected def underlying: Traversable[A]
- // Iterable delegates
- // Iterable methods could be printed by cat TarversableLike.scala | sed -n '/trait Iterable/,$ p' | egrep '^ (override )?def'
-
- override def isEmpty = underlying.isEmpty
- override def nonEmpty = underlying.nonEmpty
+ override def foreach[B](f: A => B): Unit = underlying.foreach(f)
+ override def isEmpty: Boolean = underlying.isEmpty
+ override def nonEmpty: Boolean = underlying.nonEmpty
+ override def size: Int = underlying.size
override def hasDefiniteSize = underlying.hasDefiniteSize
- override def foreach[B](f: A => B) = underlying.foreach(f)
override def forall(p: A => Boolean): Boolean = underlying.forall(p)
override def exists(p: A => Boolean): Boolean = underlying.exists(p)
override def count(p: A => Boolean): Int = underlying.count(p)
override def find(p: A => Boolean): Option[A] = underlying.find(p)
override def foldLeft[B](z: B)(op: (B, A) => B): B = underlying.foldLeft(z)(op)
+ override def /: [B](z: B)(op: (B, A) => B): B = underlying./:(z)(op)
override def foldRight[B](z: B)(op: (A, B) => B): B = underlying.foldRight(z)(op)
+ override def :\ [B](z: B)(op: (A, B) => B): B = underlying.:\(z)(op)
override def reduceLeft[B >: A](op: (B, A) => B): B = underlying.reduceLeft(op)
- override def reduceRight[B >: A](op: (A, B) => B): B = underlying.reduceRight(op)
override def reduceLeftOption[B >: A](op: (B, A) => B): Option[B] = underlying.reduceLeftOption(op)
+ override def reduceRight[B >: A](op: (A, B) => B): B = underlying.reduceRight(op)
override def reduceRightOption[B >: A](op: (A, B) => B): Option[B] = underlying.reduceRightOption(op)
+ override def sum[B >: A](implicit num: Numeric[B]): B = underlying.sum(num)
+ override def product[B >: A](implicit num: Numeric[B]): B = underlying.product(num)
+ override def min[B >: A](implicit cmp: Ordering[B]): A = underlying.min(cmp)
+ override def max[B >: A](implicit cmp: Ordering[B]): A = underlying.max(cmp)
+ override def head: A = underlying.head
+ override def headOption: Option[A] = underlying.headOption
+ override def last: A = underlying.last
+ override def lastOption: Option[A] = underlying.lastOption
override def copyToBuffer[B >: A](dest: Buffer[B]) = underlying.copyToBuffer(dest)
override def copyToArray[B >: A](xs: Array[B], start: Int, len: Int) = underlying.copyToArray(xs, start, len)
- override def toArray[B >: A : ClassManifest]: Array[B] = underlying.toArray
+ override def copyToArray[B >: A](xs: Array[B], start: Int) = underlying.copyToArray(xs, start)
+ override def copyToArray[B >: A](xs: Array[B]) = underlying.copyToArray(xs)
+ override def toArray[B >: A: ClassManifest]: Array[B] = underlying.toArray
override def toList: List[A] = underlying.toList
+ override def toIterable: Iterable[A] = underlying.toIterable
override def toSeq: Seq[A] = underlying.toSeq
+ override def toIndexedSeq[B >: A]: mutable.IndexedSeq[B] = underlying.toIndexedSeq
override def toStream: Stream[A] = underlying.toStream
+ override def toSet[B >: A]: immutable.Set[B] = underlying.toSet
+ override def toMap[T, U](implicit ev: A <:< (T, U)): immutable.Map[T, U] = underlying.toMap(ev)
override def mkString(start: String, sep: String, end: String): String = underlying.mkString(start, sep, end)
+ override def mkString(sep: String): String = underlying.mkString(sep)
+ override def mkString: String = underlying.mkString
override def addString(b: StringBuilder, start: String, sep: String, end: String): StringBuilder = underlying.addString(b, start, sep, end)
-
- override def head: A = underlying.head
- override def last: A = underlying.last
- override def lastOption: Option[A] = underlying.lastOption
+ override def addString(b: StringBuilder, sep: String): StringBuilder = underlying.addString(b, sep)
+ override def addString(b: StringBuilder): StringBuilder = underlying.addString(b)
}
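Review note: `TraversableForwarder` now delegates the symbolic fold aliases (`/:`, `:\`) alongside `foldLeft`/`foldRight`. Their associativity, which the forwarder must preserve, on a plain list:

```scala
object FoldDemo extends App {
  val xs = List(1, 2, 3)
  assert(xs.foldLeft(10)(_ + _) == 16)   // ((10 + 1) + 2) + 3
  // foldRight associates to the right: 1 - (2 - (3 - 0)) = 2
  assert(xs.foldRight(0)(_ - _) == 2)
  assert(xs.reduceLeftOption(_ max _) == Some(3))
  println("folds delegate with associativity intact")
}
```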
diff --git a/src/library/scala/collection/generic/TraversableView.scala.1 b/src/library/scala/collection/generic/TraversableView.scala.1
deleted file mode 100644
index 3608de42be..0000000000
--- a/src/library/scala/collection/generic/TraversableView.scala.1
+++ /dev/null
@@ -1,152 +0,0 @@
-/* __ *\
-** ________ ___ / / ___ Scala API **
-** / __/ __// _ | / / / _ | (c) 2003-2010, LAMP/EPFL **
-** __\ \/ /__/ __ |/ /__/ __ | http://scala-lang.org/ **
-** /____/\___/_/ |_/____/_/ | | **
-** |/ **
-\* */
-package scalay.collection.generic
-
-import Math.MAX_INT
-import TraversableView.NoBuilder
-
-/** <p>
- * A base class for views of <code>Traversable</code>.
- * </p>
- * <p>
- * Every subclass has to implement the <code>foreach</code> method.
- * </p>
- *
- * @since 2.8
- */
-abstract class TraversableView[+A, +Coll <: Traversable[_]] extends Traversable[A] {
-self =>
-
- type This >: this.type <: TraversableView[A, Coll] { type This = self.This }
- protected val thisCollection: This = this
-
- protected[this] def newBuilder: Builder[A, This, This] =
- throw new UnsupportedOperationException(this+".newBuilder")
-
- def force[B >: A, That](implicit b: Builder[B, That, Coll]) = {
- b ++= this
- b.result()
- }
-
- trait Transformed[+B] extends TraversableView[B, Coll]
-
- /** pre: from >= 0
- */
- trait Sliced extends Transformed[A] {
- protected[this] val from: Int
- protected[this] val until: Int
- override def foreach(f: A => Unit) {
- var index = 0
- for (x <- self) {
- if (from <= index) {
- if (until <= index) return
- f(x)
- }
- index += 1
- }
- }
- override def stringPrefix = self.stringPrefix+"S"
- override def slice(from1: Int, until1: Int) =
- newSliced(from + (from1 max 0), from + (until1 max 0)).asInstanceOf[This]
- }
-
- trait Mapped[B] extends Transformed[B] {
- protected[this] val mapping: A => B
- override def foreach(f: B => Unit) {
- for (x <- self)
- f(mapping(x))
- }
- override def stringPrefix = self.stringPrefix+"M"
- }
-
- trait FlatMapped[B] extends Transformed[B] {
- protected[this] val mapping: A => Traversable[B]
- override def foreach(f: B => Unit) {
- for (x <- self)
- for (y <- mapping(x))
- f(y)
- }
- override def stringPrefix = self.stringPrefix+"N"
- }
-
- trait Appended[B >: A] extends Transformed[B] {
- protected[this] val rest: Traversable[B]
- override def foreach(f: B => Unit) {
- for (x <- self) f(x)
- for (x <- rest) f(x)
- }
- override def stringPrefix = self.stringPrefix+"A"
- }
-
- trait Filtered extends Transformed[A] {
- protected[this] val pred: A => Boolean
- override def foreach(f: A => Unit) {
- for (x <- self)
- if (pred(x)) f(x)
- }
- override def stringPrefix = self.stringPrefix+"F"
- }
-
- trait TakenWhile extends Transformed[A] {
- protected[this] val pred: A => Boolean
- override def foreach(f: A => Unit) {
- for (x <- self) {
- if (!pred(x)) return
- f(x)
- }
- }
- override def stringPrefix = self.stringPrefix+"T"
- }
-
- trait DroppedWhile extends Transformed[A] {
- protected[this] val pred: A => Boolean
- override def foreach(f: A => Unit) {
- var go = false
- for (x <- self) {
- if (!go && !pred(x)) go = true
- if (go) f(x)
- }
- }
- override def stringPrefix = self.stringPrefix+"D"
- }
-
- override def ++[B >: A, That](that: Traversable[B])(implicit b: Builder[B, That, This]): That =
- if (b.isInstanceOf[NoBuilder[_]]) newAppended(that).asInstanceOf[That]
- else super.++[B, That](that)(b)
-
- override def ++[B >: A, That](that: Iterator[B])(implicit b: Builder[B, That, This]): That = ++[B, That](that.toStream)
-
- override def map[B, That](f: A => B)(implicit b: Builder[B, That, This]): That =
- if (b.isInstanceOf[NoBuilder[_]]) newMapped(f).asInstanceOf[That]
- else super.map[B, That](f)(b)
-
- override def flatMap[B, That](f: A => Traversable[B])(implicit b: Builder[B, That, This]): That =
- if (b.isInstanceOf[NoBuilder[_]]) newFlatMapped(f).asInstanceOf[That]
- else super.flatMap[B, That](f)(b)
-
- override def filter(p: A => Boolean): This = newFiltered(p).asInstanceOf[This]
- override def init: This = newSliced(0, size - 1).asInstanceOf[This]
- override def drop(n: Int): This = newSliced(n max 0, MAX_INT).asInstanceOf[This]
- override def take(n: Int): This = newSliced(0, n).asInstanceOf[This]
- override def slice(from: Int, until: Int): This = newSliced(from max 0, until).asInstanceOf[This]
- override def dropWhile(p: A => Boolean): This = newDroppedWhile(p).asInstanceOf[This]
- override def takeWhile(p: A => Boolean): This = newTakenWhile(p).asInstanceOf[This]
- override def span(p: A => Boolean): (This, This) = (takeWhile(p), dropWhile(p))
- override def splitAt(n: Int): (This, This) = (take(n), drop(n))
-}
-
-object TraversableView {
- class NoBuilder[A] extends Builder[A, Nothing, TraversableView[_, _]] {
- def +=(elem: A) {}
- def iterator: Iterator[A] = Iterator.empty
- @deprecated("use `iterator' instead") def elements = iterator
- def result() = throw new UnsupportedOperationException("TraversableView.Builder.result")
- def clear() {}
- }
- implicit def implicitBuilder[A]: Builder[A, TraversableView[A, Traversable[_]], TraversableView[_, _]] = new NoBuilder
-}
diff --git a/src/library/scala/collection/immutable/DefaultMap.scala b/src/library/scala/collection/immutable/DefaultMap.scala
new file mode 100755
index 0000000000..7ee8197150
--- /dev/null
+++ b/src/library/scala/collection/immutable/DefaultMap.scala
@@ -0,0 +1,53 @@
+/* __ *\
+** ________ ___ / / ___ Scala API **
+** / __/ __// _ | / / / _ | (c) 2003-2010, LAMP/EPFL **
+** __\ \/ /__/ __ |/ /__/ __ | http://scala-lang.org/ **
+** /____/\___/_/ |_/____/_/ | | **
+** |/ **
+\* */
+
+// $Id: DefaultMap.scala 20028 2009-12-07 11:49:19Z cunei $
+
+
+package scala.collection
+package immutable
+
+import generic._
+
+/** <p>
+ * A default map which implements the <code>updated</code> and <code>-</code>
+ * methods of maps.<br/>
+ * Instances that inherit from <code>DefaultMap[A, B]</code> still have to
+ * define:
+ * </p><pre>
+ * <b>def</b> get(key: A): Option[B]
+ * <b>def</b> iterator: Iterator[(A, B)]</pre>
+ * <p>
+ * It refers back to the original map.
+ * </p>
+ * <p>
+ * It might also be advisable to override <code>foreach</code> or
+ * <code>size</code> if efficient implementations can be found.
+ * </p>
+ *
+ * @since 2.8
+ */
+trait DefaultMap[A, +B] extends Map[A, B] { self =>
+
+ /** A default implementation which creates a new immutable map.
+ */
+ override def +[B1 >: B](kv: (A, B1)): Map[A, B1] = {
+ val b = Map.newBuilder[A, B1]
+ b ++= this
+ b += ((kv._1, kv._2))
+ b.result
+ }
+
+ /** A default implementation which creates a new immutable map.
+ */
+ override def - (key: A): Map[A, B] = {
+ val b = newBuilder
+ b ++= this filter (kv => kv._1 != key)
+ b.result
+ }
+}
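The two default implementations above both rebuild through a fresh immutable-Map builder. A minimal standalone sketch of that rebuild strategy (plain helper functions rather than the trait itself, purely for illustration):

```scala
// Sketch of DefaultMap's rebuild strategy: `+` and `-` copy the original
// map through a fresh immutable-Map builder instead of mutating anything.
def plus[A, B](self: Map[A, B], kv: (A, B)): Map[A, B] = {
  val b = Map.newBuilder[A, B]
  b ++= self
  b += kv
  b.result()
}

def minus[A, B](self: Map[A, B], key: A): Map[A, B] = {
  val b = Map.newBuilder[A, B]
  b ++= self.filter(kv => kv._1 != key) // keep every pair whose key differs
  b.result()
}

val m = Map("a" -> 1, "b" -> 2)
assert(plus(m, "c" -> 3) == Map("a" -> 1, "b" -> 2, "c" -> 3))
assert(minus(m, "a") == Map("b" -> 2))
```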
diff --git a/src/library/scala/collection/immutable/HashMap.scala b/src/library/scala/collection/immutable/HashMap.scala
index 2215e22f71..e0f801546c 100644
--- a/src/library/scala/collection/immutable/HashMap.scala
+++ b/src/library/scala/collection/immutable/HashMap.scala
@@ -16,184 +16,383 @@ import generic._
import annotation.unchecked.uncheckedVariance
/** <p>
- * This class implements immutable maps using a hash table.
- * </p>
- * <p>
- * It is optimized for sequential accesses where the last updated table is
- * accessed most often. It supports with reasonable efficiency accesses to
- * previous versions of the table by keeping a change log that's regularly
- * compacted. It needs to synchronize most methods, so it is less suitable
- * for highly concurrent accesses.
+ * This class implements immutable maps using a hash trie.
* </p>
*
* @note the builder of a hash map returns specialized representations EmptyMap, Map1, ..., Map4
* for maps of size <= 4.
*
* @author Martin Odersky
- * @version 2.0, 19/01/2007
+ * @author Tiark Rompf
+ * @version 2.8
* @since 2.3
*/
-@serializable @SerialVersionUID(1L)
-class HashMap[A, +B] extends Map[A,B] with MapLike[A, B, HashMap[A, B]] with mutable.HashTable[A] {
+@serializable @SerialVersionUID(2L)
+class HashMap[A, +B] extends Map[A,B] with MapLike[A, B, HashMap[A, B]] {
- type Entry = scala.collection.mutable.DefaultEntry[A, Any]
-
- @transient protected var later: HashMap[A, B @uncheckedVariance] = null
- @transient protected var oldKey: A = _
- @transient protected var oldValue: Option[B @uncheckedVariance] = _
- @transient protected var deltaSize: Int = _
+ override def size: Int = 0
override def empty = HashMap.empty[A, B]
- def get(key: A): Option[B] = synchronized {
- var m: HashMap[A, _ >: B] = this
- var cnt = 0
- while (m.later != null) {
- if (key == m.oldKey) return m.oldValue.asInstanceOf[Option[B]]
- cnt += 1
- m = m.later
- }
- if (cnt > logLimit) makeCopy(m)
- val e = m.findEntry(key)
- if (e == null) None
- else Some(getValue(e))
- }
+ def iterator: Iterator[(A,B)] = Iterator.empty
- override def updated [B1 >: B] (key: A, value: B1): HashMap[A, B1] = synchronized {
- makeCopyIfUpdated()
- val e = findEntry(key)
- if (e == null) {
- markUpdated(key, None, 1)
- later.addEntry(new Entry(key, value))
- } else {
- markUpdated(key, Some(getValue(e)), 0)
- e.value = value
- }
- later.asInstanceOf[HashMap[A, B1]]
- }
+ override def foreach[U](f: ((A, B)) => U): Unit = { }
+
+ def get(key: A): Option[B] =
+ get0(key, computeHash(key), 0)
+
+ override def updated [B1 >: B] (key: A, value: B1): HashMap[A, B1] =
+ updated0(key, computeHash(key), 0, value, null)
+
+ override def + [B1 >: B] (kv: (A, B1)): HashMap[A, B1] =
+ updated0(kv._1, computeHash(kv._1), 0, kv._2, kv)
- /** Add a key/value pair to this map.
- * @param kv the key/value pair
- * @return A new map with the new binding added to this map
- */
- override def + [B1 >: B] (kv: (A, B1)): HashMap[A, B1] = updated(kv._1, kv._2)
-
- /** Adds two or more elements to this collection and returns
- * either the collection itself (if it is mutable), or a new collection
- * with the added elements.
- *
- * @param elem1 the first element to add.
- * @param elem2 the second element to add.
- * @param elems the remaining elements to add.
- */
override def + [B1 >: B] (elem1: (A, B1), elem2: (A, B1), elems: (A, B1) *): HashMap[A, B1] =
this + elem1 + elem2 ++ elems
+ // TODO: optimize (might be able to use mutable updates)
- def - (key: A): HashMap[A, B] = synchronized {
- makeCopyIfUpdated()
- val e = findEntry(key)
- if (e == null) this
- else {
- markUpdated(key, Some(getValue(e)), -1)
- later removeEntry key
- later.asInstanceOf[HashMap[A, B]]
- }
- }
+ def - (key: A): HashMap[A, B] =
+ removed0(key, computeHash(key), 0)
- override def size: Int = synchronized {
- var m: HashMap[A, _ >: B] = this
- var cnt = 0
- var s = 0
- while (m.later != null) {
- s -= m.deltaSize
- cnt += 1
- m = m.later
- }
- s += m.tableSize
- if (cnt > logLimit) makeCopy(m)
- s
- }
+ protected def elemHashCode(key: A) = if (key == null) 0 else key.hashCode()
- def iterator = synchronized {
- makeCopyIfUpdated()
- entriesIterator map {e => (e.key, getValue(e))}
+ protected final def improve(hcode: Int) = {
+ var h: Int = hcode + ~(hcode << 9)
+ h = h ^ (h >>> 14)
+ h = h + (h << 4)
+ h ^ (h >>> 10)
}
- private def getValue(e: Entry) =
- e.value.asInstanceOf[B]
-
- private def logLimit: Int = math.sqrt(table.length).toInt
-
- private[this] def markUpdated(key: A, ov: Option[B], delta: Int) {
- val lf = loadFactor
- later = new HashMap[A, B] {
- override def initialSize = 0
- /* We need to do this to avoid a reference to the outer HashMap */
- def _newLoadFactor = lf
- override def loadFactor = _newLoadFactor
- table = HashMap.this.table
- tableSize = HashMap.this.tableSize
- threshold = HashMap.this.threshold
- }
- oldKey = key
- oldValue = ov
- deltaSize = delta
- }
+ protected def computeHash(key: A) = improve(elemHashCode(key))
- private def makeCopy(last: HashMap[A, _ >: B]) {
- def undo(m: HashMap[A, _ >: B]) {
- if (m ne last) {
- undo(m.later)
- if (m.deltaSize == 1) removeEntry(m.oldKey)
- else if (m.deltaSize == 0) findEntry(m.oldKey).value = m.oldValue.get
- else if (m.deltaSize == -1) addEntry(new Entry(m.oldKey, m.oldValue.get))
- }
- }
- def copy(e: Entry): Entry =
- if (e == null) null
- else {
- val rest = copy(e.next)
- val result = new Entry(e.key, e.value)
- result.next = rest
- result
- }
- val ltable = last.table
- val s = ltable.length
- table = new scala.Array[collection.mutable.HashEntry[A, Entry]](s)
- var i = 0
- while (i < s) {
- table(i) = copy(ltable(i).asInstanceOf[Entry])
- i += 1
- }
- tableSize = last.tableSize
- threshold = last.threshold
- undo(this)
- later = null
- }
+ protected def get0(key: A, hash: Int, level: Int): Option[B] = None
- private def makeCopyIfUpdated() {
- var m: HashMap[A, _ >: B] = this
- while (m.later != null) m = m.later
- if (m ne this) makeCopy(m)
- }
+ protected def updated0[B1 >: B](key: A, hash: Int, level: Int, value: B1, kv: (A, B1)): HashMap[A, B1] =
+ new HashMap.HashMap1(key, hash, value, kv)
- private def writeObject(out: java.io.ObjectOutputStream) {
- serializeTo(out, _.value)
- }
- private def readObject(in: java.io.ObjectInputStream) {
- init[B](in, new Entry(_, _))
- }
+
+ protected def removed0(key: A, hash: Int, level: Int): HashMap[A, B] = this
+
}
/** A factory object for immutable HashMaps.
*
* @author Martin Odersky
+ * @author Tiark Rompf
* @version 2.8
* @since 2.3
*/
object HashMap extends ImmutableMapFactory[HashMap] {
implicit def canBuildFrom[A, B]: CanBuildFrom[Coll, (A, B), HashMap[A, B]] = new MapCanBuildFrom[A, B]
- def empty[A, B]: HashMap[A, B] = new HashMap
+ def empty[A, B]: HashMap[A, B] = EmptyHashMap.asInstanceOf[HashMap[A, B]]
+
+ private object EmptyHashMap extends HashMap[Any,Nothing] {
+
+ }
+
+ // TODO: add HashMap2, HashMap3, ...
+
+ class HashMap1[A,+B](private var key: A, private[HashMap] var hash: Int, private var value: (B @uncheckedVariance), private var kv: (A,B @uncheckedVariance)) extends HashMap[A,B] {
+ override def size = 1
+
+ override def get0(key: A, hash: Int, level: Int): Option[B] =
+ if (hash == this.hash && key == this.key) Some(value) else None
+
+ override def updated0[B1 >: B](key: A, hash: Int, level: Int, value: B1, kv: (A, B1)): HashMap[A, B1] =
+ if (hash == this.hash && key == this.key) new HashMap1(key, hash, value, kv)
+ else {
+ if (hash != this.hash) {
+ //new HashTrieMap[A,B1](level+5, this, new HashMap1(key, hash, value, kv))
+ val m = new HashTrieMap[A,B1](0,new Array[HashMap[A,B1]](0),0) // TODO: could save array alloc
+ m.updated0(this.key, this.hash, level, this.value, this.kv).updated0(key, hash, level, value, kv)
+ } else {
+ // 32-bit hash collision (rare, but not impossible)
+ // wrap this in a HashTrieMap if called with level == 0 (otherwise serialization won't work)
+ if (level == 0) {
+ val elems = new Array[HashMap[A,B1]](1)
+ elems(0) = new HashMapCollision1(hash, ListMap.empty.updated(this.key,this.value).updated(key,value))
+ new HashTrieMap[A,B1](1 << ((hash >>> level) & 0x1f), elems, 2)
+ } else {
+ new HashMapCollision1(hash, ListMap.empty.updated(this.key,this.value).updated(key,value))
+ }
+ }
+ }
+
+ override def removed0(key: A, hash: Int, level: Int): HashMap[A, B] =
+ if (hash == this.hash && key == this.key) HashMap.empty[A,B] else this
+
+ override def iterator: Iterator[(A,B)] = Iterator(ensurePair)
+ override def foreach[U](f: ((A, B)) => U): Unit = f(ensurePair)
+ private[HashMap] def ensurePair: (A,B) = if (kv ne null) kv else { kv = (key, value); kv }
+
+ private def writeObject(out: java.io.ObjectOutputStream) {
+ out.writeObject(key)
+ out.writeObject(value)
+ }
+
+ private def readObject(in: java.io.ObjectInputStream) {
+ key = in.readObject().asInstanceOf[A]
+ value = in.readObject().asInstanceOf[B]
+ hash = computeHash(key)
+ }
+
+ }
+
+ private class HashMapCollision1[A,+B](private[HashMap] var hash: Int, var kvs: ListMap[A,B @uncheckedVariance]) extends HashMap[A,B] {
+ override def size = kvs.size
+
+ override def get0(key: A, hash: Int, level: Int): Option[B] =
+ if (hash == this.hash) kvs.get(key) else None
+
+ override def updated0[B1 >: B](key: A, hash: Int, level: Int, value: B1, kv: (A, B1)): HashMap[A, B1] =
+ if (hash == this.hash) new HashMapCollision1(hash, kvs.updated(key, value))
+ else {
+ var m: HashMap[A,B1] = new HashTrieMap[A,B1](0,new Array[HashMap[A,B1]](0),0)
+ // might be able to save some ops here, but it doesn't seem to be worth it
+ for ((k,v) <- kvs)
+ m = m.updated0(k, this.hash, level, v, null)
+ m.updated0(key, hash, level, value, kv)
+ }
+
+ override def removed0(key: A, hash: Int, level: Int): HashMap[A, B] =
+ if (hash == this.hash) {
+ val kvs1 = kvs - key
+ if (!kvs1.isEmpty)
+ new HashMapCollision1(hash, kvs1)
+ else
+ HashMap.empty[A,B]
+ } else this
+
+ override def iterator: Iterator[(A,B)] = kvs.iterator
+ override def foreach[U](f: ((A, B)) => U): Unit = kvs.foreach(f)
+
+ private def writeObject(out: java.io.ObjectOutputStream) {
+ // This cannot work: reading the entries back in might produce different
+ // hash codes and remove the collision. However, this method is never called,
+ // because no references to this class are ever handed out to client code
+ // and HashTrieMap serialization takes care of the situation.
+ error("cannot serialize an immutable.HashMap where all items have the same 32-bit hash code")
+ //out.writeObject(kvs)
+ }
+
+ private def readObject(in: java.io.ObjectInputStream) {
+ error("cannot deserialize an immutable.HashMap where all items have the same 32-bit hash code")
+ //kvs = in.readObject().asInstanceOf[ListMap[A,B]]
+ //hash = computeHash(kvs.)
+ }
+
+ }
+
+
+ class HashTrieMap[A,+B](private var bitmap: Int, private var elems: Array[HashMap[A,B @uncheckedVariance]],
+ private var size0: Int) extends HashMap[A,B] {
+/*
+ def this (level: Int, m1: HashMap1[A,B], m2: HashMap1[A,B]) = {
+ this(((m1.hash >>> level) & 0x1f) | ((m2.hash >>> level) & 0x1f), {
+ val idx1 = (m1.hash >>> level) & 0x1f
+ val idx2 = (m2.hash >>> level) & 0x1f
+ assert(idx1 != idx2, m1.hash + "==" + m2.hash + " at level " + level) // TODO
+ val elems = new Array[HashMap[A,B]](2)
+ if (idx1 < idx2) {
+ elems(0) = m1
+ elems(1) = m2
+ } else {
+ elems(0) = m2
+ elems(1) = m1
+ }
+ elems
+ }, 2)
+ }
+*/
+ override def size = size0
+
+ override def get0(key: A, hash: Int, level: Int): Option[B] = {
+ val index = (hash >>> level) & 0x1f
+ val mask = (1 << index)
+ if (bitmap == -1) {
+ elems(index & 0x1f).get0(key, hash, level + 5)
+ } else if ((bitmap & mask) != 0) {
+ val offset = Integer.bitCount(bitmap & (mask-1))
+ // TODO: might be worth checking if sub is HashTrieMap (-> monomorphic call site)
+ elems(offset).get0(key, hash, level + 5)
+ } else
+ None
+ }
+
+ override def updated0[B1 >: B](key: A, hash: Int, level: Int, value: B1, kv: (A, B1)): HashMap[A, B1] = {
+ val index = (hash >>> level) & 0x1f
+ val mask = (1 << index)
+ val offset = Integer.bitCount(bitmap & (mask-1))
+ if ((bitmap & mask) != 0) {
+ val elemsNew = new Array[HashMap[A,B1]](elems.length)
+ Array.copy(elems, 0, elemsNew, 0, elems.length)
+ val sub = elems(offset)
+ // TODO: might be worth checking if sub is HashTrieMap (-> monomorphic call site)
+ val subNew = sub.updated0(key, hash, level + 5, value, kv)
+ elemsNew(offset) = subNew
+ new HashTrieMap(bitmap, elemsNew, size + (subNew.size - sub.size))
+ } else {
+ val elemsNew = new Array[HashMap[A,B1]](elems.length + 1)
+ Array.copy(elems, 0, elemsNew, 0, offset)
+ elemsNew(offset) = new HashMap1(key, hash, value, kv)
+ Array.copy(elems, offset, elemsNew, offset + 1, elems.length - offset)
+ val bitmapNew = bitmap | mask
+ new HashTrieMap(bitmapNew, elemsNew, size + 1)
+ }
+ }
+
+ override def removed0(key: A, hash: Int, level: Int): HashMap[A, B] = {
+ val index = (hash >>> level) & 0x1f
+ val mask = (1 << index)
+ val offset = Integer.bitCount(bitmap & (mask-1))
+ if (((bitmap >>> index) & 1) == 1) {
+ val elemsNew = new Array[HashMap[A,B]](elems.length)
+ Array.copy(elems, 0, elemsNew, 0, elems.length)
+ val sub = elems(offset)
+ // TODO: might be worth checking if sub is HashTrieMap (-> monomorphic call site)
+ val subNew = sub.removed0(key, hash, level + 5)
+ elemsNew(offset) = subNew
+ // TODO: handle shrinking
+ val sizeNew = size + (subNew.size - sub.size)
+ if (sizeNew > 0)
+ new HashTrieMap(bitmap, elemsNew, sizeNew)
+ else
+ HashMap.empty[A,B]
+ } else {
+ this
+ }
+ }
+
+/*
+ override def iterator = { // TODO: optimize (use a stack to keep track of pos)
+
+ def iter(m: HashTrieMap[A,B], k: => Stream[(A,B)]): Stream[(A,B)] = {
+ def horiz(elems: Array[HashMap[A,B]], i: Int, k: => Stream[(A,B)]): Stream[(A,B)] = {
+ if (i < elems.length) {
+ elems(i) match {
+ case m: HashTrieMap[A,B] => iter(m, horiz(elems, i+1, k))
+ case m: HashMap1[A,B] => new Stream.Cons(m.ensurePair, horiz(elems, i+1, k))
+ }
+ } else k
+ }
+ horiz(m.elems, 0, k)
+ }
+ iter(this, Stream.empty).iterator
+ }
+*/
+
+
+ override def iterator = new Iterator[(A,B)] {
+ private[this] var depth = 0
+ private[this] var arrayStack = new Array[Array[HashMap[A,B]]](6)
+ private[this] var posStack = new Array[Int](6)
+
+ private[this] var arrayD = elems
+ private[this] var posD = 0
+
+ private[this] var subIter: Iterator[(A,B)] = null // to traverse collision nodes
+
+ def hasNext = (subIter ne null) || depth >= 0
+
+ def next: (A,B) = {
+ if (subIter ne null) {
+ val el = subIter.next
+ if (!subIter.hasNext)
+ subIter = null
+ el
+ } else
+ next0(arrayD, posD)
+ }
+
+ @scala.annotation.tailrec private[this] def next0(elems: Array[HashMap[A,B]], i: Int): (A,B) = {
+ if (i == elems.length-1) { // reached end of level, pop stack
+ depth -= 1
+ if (depth >= 0) {
+ arrayD = arrayStack(depth)
+ posD = posStack(depth)
+ arrayStack(depth) = null
+ } else {
+ arrayD = null
+ posD = 0
+ }
+ } else
+ posD += 1
+
+ elems(i) match {
+ case m: HashTrieMap[A,B] => // push current pos onto stack and descend
+ if (depth >= 0) {
+ arrayStack(depth) = arrayD
+ posStack(depth) = posD
+ }
+ depth += 1
+ arrayD = m.elems
+ posD = 0
+ next0(m.elems, 0)
+ case m: HashMap1[A,B] => m.ensurePair
+ case m =>
+ subIter = m.iterator
+ subIter.next
+ }
+ }
+ }
+
+/*
+
+import collection.immutable._
+def time(block: =>Unit) = { val t0 = System.nanoTime; block; println("elapsed: " + (System.nanoTime - t0)/1000000.0) }
+var mOld = OldHashMap.empty[Int,Int]
+var mNew = HashMap.empty[Int,Int]
+time { for (i <- 0 until 100000) mOld = mOld.updated(i,i) }
+time { for (i <- 0 until 100000) mOld = mOld.updated(i,i) }
+time { for (i <- 0 until 100000) mOld = mOld.updated(i,i) }
+time { for (i <- 0 until 100000) mNew = mNew.updated(i,i) }
+time { for (i <- 0 until 100000) mNew = mNew.updated(i,i) }
+time { for (i <- 0 until 100000) mNew = mNew.updated(i,i) }
+time { mOld.iterator.foreach( p => ()) }
+time { mOld.iterator.foreach( p => ()) }
+time { mOld.iterator.foreach( p => ()) }
+time { mNew.iterator.foreach( p => ()) }
+time { mNew.iterator.foreach( p => ()) }
+time { mNew.iterator.foreach( p => ()) }
+
+*/
+
+
+ override def foreach[U](f: ((A, B)) => U): Unit = {
+ var i = 0
+ while (i < elems.length) {
+ elems(i).foreach(f)
+ i += 1
+ }
+ }
+
+
+ private def writeObject(out: java.io.ObjectOutputStream) {
+ // no out.defaultWriteObject()
+ out.writeInt(size)
+ foreach { p =>
+ out.writeObject(p._1)
+ out.writeObject(p._2)
+ }
+ }
+
+ private def readObject(in: java.io.ObjectInputStream) {
+ val size = in.readInt
+ var index = 0
+ var m = HashMap.empty[A,B]
+ while (index < size) {
+ // TODO: optimize (use unsafe mutable update)
+ m = m + ((in.readObject.asInstanceOf[A], in.readObject.asInstanceOf[B]))
+ index += 1
+ }
+ var tm = m.asInstanceOf[HashTrieMap[A,B]]
+ bitmap = tm.bitmap
+ elems = tm.elems
+ size0 = tm.size0
+ }
+
+ }
+
}
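The `get0`/`updated0` arithmetic in `HashTrieMap` is the standard bitmapped-trie trick: each level consumes 5 hash bits to select one of 32 logical slots, a 32-bit bitmap records which slots are occupied, and a popcount of the bits below the slot converts the sparse slot number into a dense array offset. A standalone sketch, lifted directly from the expressions in the patch:

```scala
// Slot selection: 5 hash bits per trie level.
def slotIndex(hash: Int, level: Int): Int = (hash >>> level) & 0x1f

// Dense-array offset: count occupied slots below this one.
def arrayOffset(bitmap: Int, index: Int): Int = {
  val mask = 1 << index
  Integer.bitCount(bitmap & (mask - 1))
}

// Suppose slots 3, 7 and 20 of a node are occupied:
val bitmap = (1 << 3) | (1 << 7) | (1 << 20)
assert(arrayOffset(bitmap, 3) == 0)  // slot 3 is the first stored element
assert(arrayOffset(bitmap, 7) == 1)
assert(arrayOffset(bitmap, 20) == 2)
```

This is why the node array can hold only the occupied children: the bitmap plus `Integer.bitCount` recovers each child's position without storing 32 entries.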
diff --git a/src/library/scala/collection/immutable/HashSet.scala b/src/library/scala/collection/immutable/HashSet.scala
index 2320187be9..16d4473de1 100644
--- a/src/library/scala/collection/immutable/HashSet.scala
+++ b/src/library/scala/collection/immutable/HashSet.scala
@@ -13,148 +13,353 @@ package scala.collection
package immutable
import generic._
+import annotation.unchecked.uncheckedVariance
/** <p>
- * This class implements immutable sets using a hash table.
- * </p>
- * <p>
- * It is optimized for sequential accesses where the last updated table is
- * accessed most often. It supports with reasonable efficiency accesses to
- * previous versions of the table by keeping a change log that's regularly
- * compacted. It needs to synchronize most methods, so it is less suitable
- * for highly concurrent accesses.
+ * This class implements immutable sets using a hash trie.
* </p>
*
* @note the builder of a hash set returns specialized representations EmptySet, Set1, ..., Set4
* for sets of size <= 4.
*
* @author Martin Odersky
+ * @author Tiark Rompf
* @version 2.8
* @since 2.3
*/
-@serializable @SerialVersionUID(1L)
+@serializable @SerialVersionUID(2L)
class HashSet[A] extends Set[A]
with GenericSetTemplate[A, HashSet]
- with SetLike[A, HashSet[A]]
- with mutable.FlatHashTable[A] {
+ with SetLike[A, HashSet[A]] {
override def companion: GenericCompanion[HashSet] = HashSet
- @transient protected var later: HashSet[A] = null
- @transient protected var changedElem: A = _
- @transient protected var deleted: Boolean = _
-
- def contains(elem: A): Boolean = synchronized {
- var m = this
- var cnt = 0
- while (m.later != null) {
- if (elem == m.changedElem) return m.deleted
- cnt += 1
- m = m.later
- }
- if (cnt > logLimit) makeCopy(m)
- m.containsEntry(elem)
- }
+ //class HashSet[A] extends Set[A] with SetLike[A, HashSet[A]] {
- def + (elem: A): HashSet[A] = synchronized {
- makeCopyIfUpdated()
- if (containsEntry(elem)) this
- else {
- markUpdated(elem, false)
- later addEntry elem
- later
- }
- }
+ override def size: Int = 0
- def - (elem: A): HashSet[A] = synchronized {
- makeCopyIfUpdated()
- if (!containsEntry(elem)) this
- else {
- markUpdated(elem, true)
- later removeEntry elem
- later
- }
- }
+ override def empty = HashSet.empty[A]
- override def size: Int = synchronized {
- var m = this
- var cnt = 0
- var s = 0
- while (m.later != null) {
- if (m.deleted) s += 1 else s -= 1
- cnt += 1
- m = m.later
- }
- s += m.tableSize
- if (cnt > logLimit) makeCopy(m)
- s
- }
+ def iterator: Iterator[A] = Iterator.empty
- override def iterator = synchronized {
- makeCopyIfUpdated()
- // note need to cache because (later versions of) set might be mutated while elements are traversed.
- val cached = new mutable.ArrayBuffer() ++= super.iterator
- cached.iterator
- }
+ override def foreach[U](f: A => U): Unit = { }
- private def logLimit: Int = math.sqrt(table.length).toInt
-
- private def markUpdated(elem: A, del: Boolean) {
- val lf = loadFactor
- later = new HashSet[A] {
- override def initialSize = 0
- /* We need to do this to avoid a reference to the outer HashMap */
- def _newLoadFactor = lf
- override def loadFactor = _newLoadFactor
- table = HashSet.this.table
- tableSize = HashSet.this.tableSize
- threshold = HashSet.this.threshold
- }
- changedElem = elem
- deleted = del
- }
+ def contains(e: A): Boolean = get0(e, computeHash(e), 0)
- private def makeCopy(last: HashSet[A]) {
- def undo(m: HashSet[A]) {
- if (m.deleted) addEntry(m.changedElem)
- else removeEntry(m.changedElem)
- }
- table = new scala.Array[AnyRef](last.table.length)
- scala.Array.copy(last.table, 0, table, 0, table.length)
- tableSize = last.tableSize
- threshold = last.threshold
-
- // we need to work from the end of the list but non-tail-recursion
- // potentially blows the stack, so instead we create a stack on the heap.
- // See ticket #408.
- val toUndo = new mutable.Stack[HashSet[A]]
- toUndo pushAll ((Iterator iterate this)(_.later) takeWhile (_ ne last))
- toUndo foreach undo
- later = null
- }
+ override def + (e: A): HashSet[A] = updated0(e, computeHash(e), 0)
- private def makeCopyIfUpdated() {
- var m = this
- while (m.later != null) m = m.later
- if (m ne this) makeCopy(m)
- }
+ override def + (elem1: A, elem2: A, elems: A*): HashSet[A] =
+ this + elem1 + elem2 ++ elems
+ // TODO: optimize (might be able to use mutable updates)
- private def writeObject(s: java.io.ObjectOutputStream) {
- serializeTo(s)
- }
+ def - (e: A): HashSet[A] =
+ removed0(e, computeHash(e), 0)
+
+ protected def elemHashCode(key: A) = if (key == null) 0 else key.hashCode()
- private def readObject(in: java.io.ObjectInputStream) {
- init(in, x => x)
+ protected final def improve(hcode: Int) = {
+ var h: Int = hcode + ~(hcode << 9)
+ h = h ^ (h >>> 14)
+ h = h + (h << 4)
+ h ^ (h >>> 10)
}
+
+ protected def computeHash(key: A) = improve(elemHashCode(key))
+
+ protected def get0(key: A, hash: Int, level: Int): Boolean = false
+
+ protected def updated0(key: A, hash: Int, level: Int): HashSet[A] =
+ new HashSet.HashSet1(key, hash)
+
+
+
+ protected def removed0(key: A, hash: Int, level: Int): HashSet[A] = this
+
}
+/*
+object HashSet extends SetFactory[HashSet] {
+ implicit def canBuildFrom[A]: CanBuildFrom[Coll, A, HashSet[A]] = setCanBuildFrom[A]
+ override def empty[A]: HashSet[A] = new HashSet
+}
+*/
+
+
/** A factory object for immutable HashSets.
*
* @author Martin Odersky
+ * @author Tiark Rompf
* @version 2.8
* @since 2.3
*/
object HashSet extends SetFactory[HashSet] {
implicit def canBuildFrom[A]: CanBuildFrom[Coll, A, HashSet[A]] = setCanBuildFrom[A]
- override def empty[A]: HashSet[A] = new HashSet
+ override def empty[A]: HashSet[A] = EmptyHashSet.asInstanceOf[HashSet[A]]
+
+ private object EmptyHashSet extends HashSet[Any] {
+ }
+
+ // TODO: add HashSet2, HashSet3, ...
+
+ class HashSet1[A](private[HashSet] var key: A, private[HashSet] var hash: Int) extends HashSet[A] {
+ override def size = 1
+
+ override def get0(key: A, hash: Int, level: Int): Boolean =
+ (hash == this.hash && key == this.key)
+
+ override def updated0(key: A, hash: Int, level: Int): HashSet[A] =
+ if (hash == this.hash && key == this.key) this
+ else {
+ if (hash != this.hash) {
+ //new HashTrieSet[A](level+5, this, new HashSet1(key, hash))
+ val m = new HashTrieSet[A](0,new Array[HashSet[A]](0),0) // TODO: could save array alloc
+ m.updated0(this.key, this.hash, level).updated0(key, hash, level)
+ } else {
+ // 32-bit hash collision (rare, but not impossible)
+ // wrap this in a HashTrieSet if called with level == 0 (otherwise serialization won't work)
+ if (level == 0) {
+ val elems = new Array[HashSet[A]](1)
+ elems(0) = new HashSetCollision1(hash, ListSet.empty + this.key + key)
+ new HashTrieSet[A](1 << ((hash >>> level) & 0x1f), elems, 2)
+ } else {
+ new HashSetCollision1(hash, ListSet.empty + this.key + key)
+ }
+ }
+ }
+
+ override def removed0(key: A, hash: Int, level: Int): HashSet[A] =
+ if (hash == this.hash && key == this.key) HashSet.empty[A] else this
+
+ override def iterator: Iterator[A] = Iterator(key)
+ override def foreach[U](f: A => U): Unit = f(key)
+
+ private def writeObject(out: java.io.ObjectOutputStream) {
+ out.writeObject(key)
+ }
+
+ private def readObject(in: java.io.ObjectInputStream) {
+ key = in.readObject().asInstanceOf[A]
+ hash = computeHash(key)
+ }
+
+ }
+
+ private class HashSetCollision1[A](private[HashSet] var hash: Int, var ks: ListSet[A]) extends HashSet[A] {
+ override def size = ks.size
+
+ override def get0(key: A, hash: Int, level: Int): Boolean =
+ if (hash == this.hash) ks.contains(key) else false
+
+ override def updated0(key: A, hash: Int, level: Int): HashSet[A] =
+ if (hash == this.hash) new HashSetCollision1(hash, ks + key)
+ else {
+ var m: HashSet[A] = new HashTrieSet[A](0,new Array[HashSet[A]](0),0)
+ // might be able to save some ops here, but it doesn't seem to be worth it
+ for (k <- ks)
+ m = m.updated0(k, this.hash, level)
+ m.updated0(key, hash, level)
+ }
+
+ override def removed0(key: A, hash: Int, level: Int): HashSet[A] =
+ if (hash == this.hash) {
+ val ks1 = ks - key
+ if (!ks1.isEmpty)
+ new HashSetCollision1(hash, ks1)
+ else
+ HashSet.empty[A]
+ } else this
+
+ override def iterator: Iterator[A] = ks.iterator
+ override def foreach[U](f: A => U): Unit = ks.foreach(f)
+
+ private def writeObject(out: java.io.ObjectOutputStream) {
+ // This cannot work: reading the entries back in might produce different
+ // hash codes and remove the collision. However, this method is never called,
+ // because no references to this class are ever handed out to client code
+ // and HashTrieSet serialization takes care of the situation.
+ error("cannot serialize an immutable.HashSet where all items have the same 32-bit hash code")
+ //out.writeObject(ks)
+ }
+
+ private def readObject(in: java.io.ObjectInputStream) {
+ error("cannot deserialize an immutable.HashSet where all items have the same 32-bit hash code")
+ //ks = in.readObject().asInstanceOf[ListSet[A]]
+ //hash = computeHash(ks.)
+ }
+
+ }
+
+
+ class HashTrieSet[A](private var bitmap: Int, private var elems: Array[HashSet[A]],
+ private var size0: Int) extends HashSet[A] {
+
+ override def size = size0
+
+ override def get0(key: A, hash: Int, level: Int): Boolean = {
+ val index = (hash >>> level) & 0x1f
+ val mask = (1 << index)
+ if (bitmap == -1) {
+ elems(index & 0x1f).get0(key, hash, level + 5)
+ } else if ((bitmap & mask) != 0) {
+ val offset = Integer.bitCount(bitmap & (mask-1))
+ // TODO: might be worth checking if sub is HashTrieSet (-> monomorphic call site)
+ elems(offset).get0(key, hash, level + 5)
+ } else
+ false
+ }
+
+ override def updated0(key: A, hash: Int, level: Int): HashSet[A] = {
+ val index = (hash >>> level) & 0x1f
+ val mask = (1 << index)
+ val offset = Integer.bitCount(bitmap & (mask-1))
+ if ((bitmap & mask) != 0) {
+ val elemsNew = new Array[HashSet[A]](elems.length)
+ Array.copy(elems, 0, elemsNew, 0, elems.length)
+ val sub = elems(offset)
+ // TODO: might be worth checking if sub is HashTrieSet (-> monomorphic call site)
+ val subNew = sub.updated0(key, hash, level + 5)
+ elemsNew(offset) = subNew
+ new HashTrieSet(bitmap, elemsNew, size + (subNew.size - sub.size))
+ } else {
+ val elemsNew = new Array[HashSet[A]](elems.length + 1)
+ Array.copy(elems, 0, elemsNew, 0, offset)
+ elemsNew(offset) = new HashSet1(key, hash)
+ Array.copy(elems, offset, elemsNew, offset + 1, elems.length - offset)
+ val bitmapNew = bitmap | mask
+ new HashTrieSet(bitmapNew, elemsNew, size + 1)
+ }
+ }
+
+ override def removed0(key: A, hash: Int, level: Int): HashSet[A] = {
+ val index = (hash >>> level) & 0x1f
+ val mask = (1 << index)
+ val offset = Integer.bitCount(bitmap & (mask-1))
+ if (((bitmap >>> index) & 1) == 1) {
+ val elemsNew = new Array[HashSet[A]](elems.length)
+ Array.copy(elems, 0, elemsNew, 0, elems.length)
+ val sub = elems(offset)
+ // TODO: might be worth checking if sub is HashTrieSet (-> monomorphic call site)
+ val subNew = sub.removed0(key, hash, level + 5)
+ elemsNew(offset) = subNew
+ // TODO: handle shrinking
+ val sizeNew = size + (subNew.size - sub.size)
+ if (sizeNew > 0)
+ new HashTrieSet(bitmap, elemsNew, sizeNew)
+ else
+ HashSet.empty[A]
+ } else {
+ this
+ }
+ }
+
+ override def iterator = new Iterator[A] {
+ private[this] var depth = 0
+ private[this] var arrayStack = new Array[Array[HashSet[A]]](6)
+ private[this] var posStack = new Array[Int](6)
+
+ private[this] var arrayD = elems
+ private[this] var posD = 0
+
+ private[this] var subIter: Iterator[A] = null // to traverse collision nodes
+
+ def hasNext = (subIter ne null) || depth >= 0
+
+ def next: A = {
+ if (subIter ne null) {
+ val el = subIter.next
+ if (!subIter.hasNext)
+ subIter = null
+ el
+ } else
+ next0(arrayD, posD)
+ }
+
+ @scala.annotation.tailrec private[this] def next0(elems: Array[HashSet[A]], i: Int): A = {
+ if (i == elems.length-1) { // reached end of level, pop stack
+ depth -= 1
+ if (depth >= 0) {
+ arrayD = arrayStack(depth)
+ posD = posStack(depth)
+ arrayStack(depth) = null
+ } else {
+ arrayD = null
+ posD = 0
+ }
+ } else
+ posD += 1
+
+ elems(i) match {
+ case m: HashTrieSet[A] => // push current pos onto stack and descend
+ if (depth >= 0) {
+ arrayStack(depth) = arrayD
+ posStack(depth) = posD
+ }
+ depth += 1
+ arrayD = m.elems
+ posD = 0
+ next0(m.elems, 0)
+ case m: HashSet1[A] => m.key
+ case m =>
+ subIter = m.iterator
+ subIter.next
+ }
+ }
+ }
+
+/*
+
+import collection.immutable._
+def time(block: =>Unit) = { val t0 = System.nanoTime; block; println("elapsed: " + (System.nanoTime - t0)/1000000.0) }
+var mOld = OldHashSet.empty[Int]
+var mNew = HashSet.empty[Int]
+time { for (i <- 0 until 100000) mOld = mOld + i }
+time { for (i <- 0 until 100000) mOld = mOld + i }
+time { for (i <- 0 until 100000) mOld = mOld + i }
+time { for (i <- 0 until 100000) mNew = mNew + i }
+time { for (i <- 0 until 100000) mNew = mNew + i }
+time { for (i <- 0 until 100000) mNew = mNew + i }
+time { mOld.iterator.foreach( p => ()) }
+time { mOld.iterator.foreach( p => ()) }
+time { mOld.iterator.foreach( p => ()) }
+time { mNew.iterator.foreach( p => ()) }
+time { mNew.iterator.foreach( p => ()) }
+time { mNew.iterator.foreach( p => ()) }
+
+*/
+
+
+ override def foreach[U](f: A => U): Unit = {
+ var i = 0
+ while (i < elems.length) {
+ elems(i).foreach(f)
+ i += 1
+ }
+ }
+
+
+ private def writeObject(out: java.io.ObjectOutputStream) {
+ // no out.defaultWriteObject()
+ out.writeInt(size)
+ foreach { e =>
+ out.writeObject(e)
+ }
+ }
+
+ private def readObject(in: java.io.ObjectInputStream) {
+ val size = in.readInt
+ var index = 0
+ var m = HashSet.empty[A]
+ while (index < size) {
+ // TODO: optimize (use unsafe mutable update)
+ m = m + in.readObject.asInstanceOf[A]
+ index += 1
+ }
+ var tm = m.asInstanceOf[HashTrieSet[A]]
+ bitmap = tm.bitmap
+ elems = tm.elems
+ size0 = tm.size0
+ }
+
+ }
+
}
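Both HashMap and HashSet share the same `improve` bit-mixer before any trie lookup. Reproduced standalone below to show why it is there: hash codes 32 apart share the same low 5 bits, so without mixing, every 32nd small-integer key would land in the same top-level slot. For the specific pair 0 and 32, the mixed hashes demonstrably disagree in their low 5 bits (this is one checked example, not a general guarantee for all pairs):

```scala
// The `improve` bit-mixer from the patch, verbatim, as a standalone function.
def improve(hcode: Int): Int = {
  var h: Int = hcode + ~(hcode << 9)
  h = h ^ (h >>> 14)
  h = h + (h << 4)
  h ^ (h >>> 10)
}

// Raw hash codes 0 and 32 collide in the 5 bits selecting the top-level slot...
assert((0 & 0x1f) == (32 & 0x1f))
// ...but after mixing, this pair's low 5 bits differ.
assert((improve(0) & 0x1f) != (improve(32) & 0x1f))
```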
diff --git a/src/library/scala/collection/immutable/IndexedSeq.scala b/src/library/scala/collection/immutable/IndexedSeq.scala
index 0d7b1b0d23..3f29052808 100644
--- a/src/library/scala/collection/immutable/IndexedSeq.scala
+++ b/src/library/scala/collection/immutable/IndexedSeq.scala
@@ -14,8 +14,9 @@ package immutable
import generic._
import mutable.{ArrayBuffer, Builder}
-/** A subtrait of <code>collection.IndexedSeq</code> which represents sequences
+/** A subtrait of <code>collection.IndexedSeq</code> which represents indexed sequences
* that cannot be mutated.
+ * $indexedSeqInfo
*
* @since 2.8
*/
@@ -36,5 +37,5 @@ object IndexedSeq extends SeqFactory[IndexedSeq] {
def apply(idx: Int) = buf.apply(idx)
}
implicit def canBuildFrom[A]: CanBuildFrom[Coll, A, IndexedSeq[A]] = new GenericCanBuildFrom[A]
- def newBuilder[A]: Builder[A, IndexedSeq[A]] = new ArrayBuffer[A] mapResult (buf => new Impl(buf))
-} \ No newline at end of file
+ def newBuilder[A]: Builder[A, IndexedSeq[A]] = Vector.newBuilder[A]
+}
diff --git a/src/library/scala/collection/immutable/IntMap.scala b/src/library/scala/collection/immutable/IntMap.scala
index 52451e6012..62309a9f48 100644
--- a/src/library/scala/collection/immutable/IntMap.scala
+++ b/src/library/scala/collection/immutable/IntMap.scala
@@ -151,7 +151,10 @@ import IntMap._
* <a href="http://citeseer.ist.psu.edu/okasaki98fast.html">Fast Mergeable Integer Maps</a>
 * by Okasaki and Gill. Essentially a trie based on binary digits of the integers.
*
+ * Note: As of 2.8, this class is largely superseded by HashMap.
+ *
* @since 2.7
+ *
*/
sealed abstract class IntMap[+T] extends Map[Int, T] with MapLike[Int, T, IntMap[T]] {
override def empty: IntMap[T] = IntMap.Nil;
@@ -357,7 +360,7 @@ sealed abstract class IntMap[+T] extends Map[Int, T] with MapLike[Int, T, IntMap
}
/**
- * Forms the intersection of these two maps with a combinining function. The resulting map is
+ * Forms the intersection of these two maps with a combining function. The resulting map is
* a map that has only keys present in both maps and has values produced from the original mappings
* by combining them with f.
*
diff --git a/src/library/scala/collection/immutable/LinearSeq.scala b/src/library/scala/collection/immutable/LinearSeq.scala
index c1efea037c..016afd4508 100644
--- a/src/library/scala/collection/immutable/LinearSeq.scala
+++ b/src/library/scala/collection/immutable/LinearSeq.scala
@@ -17,7 +17,7 @@ import mutable.Builder
/** A subtrait of <code>collection.LinearSeq</code> which represents sequences
* that cannot be mutated.
- *
+ * $linearSeqInfo
* @since 2.8
*/
trait LinearSeq[+A] extends Seq[A]
diff --git a/src/library/scala/collection/immutable/List.scala b/src/library/scala/collection/immutable/List.scala
index 2088f3ac78..2b91ab8852 100644
--- a/src/library/scala/collection/immutable/List.scala
+++ b/src/library/scala/collection/immutable/List.scala
@@ -46,7 +46,7 @@ import annotation.tailrec
sealed abstract class List[+A] extends LinearSeq[A]
with Product
with GenericTraversableTemplate[A, List]
- with LinearSeqLike[A, List[A]] {
+ with LinearSeqOptimized[A, List[A]] {
override def companion: GenericCompanion[List] = List
import scala.collection.{Iterable, Traversable, Seq, IndexedSeq}
@@ -61,7 +61,7 @@ sealed abstract class List[+A] extends LinearSeq[A]
* @param x the element to prepend.
* @return a list which contains `x` as first element and
* which continues with this list.
- * @ex `1 :: List(2, 3) = List(2, 3).::(1) = List(1, 2, 3)`
+ * @example `1 :: List(2, 3) = List(2, 3).::(1) = List(1, 2, 3)`
* @usecase def ::(x: A): List[A]
*/
def ::[B >: A] (x: B): List[B] =
@@ -71,7 +71,7 @@ sealed abstract class List[+A] extends LinearSeq[A]
* @param prefix The list elements to prepend.
* @return a list resulting from the concatenation of the given
* list `prefix` and this list.
- * @ex `List(1, 2) ::: List(3, 4) = List(3, 4).:::(List(1, 2)) = List(1, 2, 3, 4)`
+ * @example `List(1, 2) ::: List(3, 4) = List(3, 4).:::(List(1, 2)) = List(1, 2, 3, 4)`
* @usecase def :::(prefix: List[A]): List[A]
*/
def :::[B >: A](prefix: List[B]): List[B] =
@@ -133,16 +133,18 @@ sealed abstract class List[+A] extends LinearSeq[A]
loop(this)
}
- // Overridden methods from IterableLike or overloaded variants of such methods
+ // Overridden methods from IterableLike and SeqLike or overloaded variants of such methods
- override def ++[B >: A, That](that: Traversable[B])(implicit bf: CanBuildFrom[List[A], B, That]): That = {
+ override def ++[B >: A, That](that: TraversableOnce[B])(implicit bf: CanBuildFrom[List[A], B, That]): That = {
val b = bf(this)
if (b.isInstanceOf[ListBuffer[_]]) (this ::: that.toList).asInstanceOf[That]
else super.++(that)
}
- override def ++[B >: A, That](that: Iterator[B])(implicit bf: CanBuildFrom[List[A], B, That]): That =
- this ++ that.toList
+ override def +:[B >: A, That](elem: B)(implicit bf: CanBuildFrom[List[A], B, That]): That = bf match {
+ case _: List.GenericCanBuildFrom[_] => (elem :: this).asInstanceOf[That]
+ case _ => super.+:(elem)(bf)
+ }
override def toList: List[A] = this
@@ -288,6 +290,9 @@ sealed abstract class List[+A] extends LinearSeq[A]
b.toList
}
+ @deprecated("use `distinct' instead")
+ def removeDuplicates: List[A] = distinct
+
/** <p>
* Sort the list according to the comparison function
* `lt(e1: a, e2: a) =&gt; Boolean`,
@@ -299,7 +304,7 @@ sealed abstract class List[+A] extends LinearSeq[A]
* @param lt the comparison function
* @return a list sorted according to the comparison function
* `lt(e1: a, e2: a) =&gt; Boolean`.
- * @ex <pre>
+ * @example <pre>
* List("Steve", "Tom", "John", "Bob")
* .sort((e1, e2) => (e1 compareTo e2) &lt; 0) =
* List("Bob", "John", "Steve", "Tom")</pre>
@@ -383,7 +388,7 @@ case object Nil extends List[Nothing] {
throw new NoSuchElementException("head of empty list")
override def tail: List[Nothing] =
throw new UnsupportedOperationException("tail of empty list")
- // Removal of equals method here might lead to an infinite recusion similar to IntMap.equals.
+ // Removal of equals method here might lead to an infinite recursion similar to IntMap.equals.
override def equals(that: Any) = that match {
case that1: Seq[_] => that1.isEmpty
case _ => false
@@ -539,7 +544,7 @@ object List extends SeqFactory[List] {
* Returns the `Left` values in the given `Iterable`
* of `Either`s.
*/
- @deprecated("use `xs partialMap { case Left(x: A) => x }' instead of `List.lefts(xs)'")
+ @deprecated("use `xs collect { case Left(x: A) => x }' instead of `List.lefts(xs)'")
def lefts[A, B](es: Iterable[Either[A, B]]) =
es.foldRight[List[A]](Nil)((e, as) => e match {
case Left(a) => a :: as
@@ -549,7 +554,7 @@ object List extends SeqFactory[List] {
/**
* Returns the `Right` values in the given`Iterable` of `Either`s.
*/
- @deprecated("use `xs partialMap { case Right(x: B) => x }' instead of `List.rights(xs)'")
+ @deprecated("use `xs collect { case Right(x: B) => x }' instead of `List.rights(xs)'")
def rights[A, B](es: Iterable[Either[A, B]]) =
es.foldRight[List[B]](Nil)((e, bs) => e match {
case Left(_) => bs
@@ -561,9 +566,9 @@ object List extends SeqFactory[List] {
* @param xs the iterable of Eithers to separate
* @return a pair of lists.
*/
- @deprecated("use `Either.separate' instead")
+ @deprecated("use `(for (Left(x) <- es) yield x, for (Right(x) <- es) yield x)' instead")
def separate[A,B](es: Iterable[Either[A, B]]): (List[A], List[B]) =
- es.foldRight[(List[A], List[B])]((Nil, Nil)) {
+ es.foldRight[(List[A], List[B])]((Nil, Nil)) {
case (Left(a), (lefts, rights)) => (a :: lefts, rights)
case (Right(b), (lefts, rights)) => (lefts, b :: rights)
}
@@ -590,7 +595,7 @@ object List extends SeqFactory[List] {
*
* @param arr the array to convert
* @param start the first index to consider
- * @param len the lenght of the range to convert
+ * @param len the length of the range to convert
 * @return a list that contains the same elements as `arr`
* in the same order
*/
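The `@example` tags above can be checked directly; together with the new `+:` override, prepending through the generic interface now reuses `::` when the builder is List's own (object name is illustrative):

```scala
object ListOpsDemo extends App {
  // `::` prepends in O(1); `:::` rebuilds only the prefix list.
  assert((1 :: List(2, 3)) == List(1, 2, 3))
  assert((List(1, 2) ::: List(3, 4)) == List(1, 2, 3, 4))

  // With List's own builder, `xs ++ ys` short-circuits to `xs ::: ys.toList`
  // and `elem +: xs` to `elem :: xs`, per the overrides in the patch.
  assert((List(1, 2) ++ List(3, 4)) == List(1, 2, 3, 4))
  assert((0 +: List(1, 2)) == List(0, 1, 2))
}
```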
diff --git a/src/library/scala/collection/immutable/ListMap.scala b/src/library/scala/collection/immutable/ListMap.scala
index 0f66a1f452..d8e3e0856b 100644
--- a/src/library/scala/collection/immutable/ListMap.scala
+++ b/src/library/scala/collection/immutable/ListMap.scala
@@ -30,7 +30,7 @@ object ListMap extends ImmutableMapFactory[ListMap] {
* directly, or by applying the function <code>ListMap.empty</code>.
*
* @author Matthias Zenger
- * @author Martin Oderskty
+ * @author Martin Odersky
* @version 2.0, 01/01/2007
* @since 1
*/
diff --git a/src/library/scala/collection/immutable/LongMap.scala b/src/library/scala/collection/immutable/LongMap.scala
index e527712475..0d74e41cec 100644
--- a/src/library/scala/collection/immutable/LongMap.scala
+++ b/src/library/scala/collection/immutable/LongMap.scala
@@ -138,6 +138,8 @@ import LongMap._;
* <a href="http://citeseer.ist.psu.edu/okasaki98fast.html">Fast Mergeable Long Maps</a>
 * by Okasaki and Gill. Essentially a trie based on binary digits of the integers.
*
+ * Note: As of 2.8, this class is largely superseded by HashMap.
+ *
* @since 2.7
*/
sealed abstract class LongMap[+T] extends Map[Long, T] with MapLike[Long, T, LongMap[T]] {
@@ -344,7 +346,7 @@ sealed abstract class LongMap[+T] extends Map[Long, T] with MapLike[Long, T, Lon
}
/**
- * Forms the intersection of these two maps with a combinining function. The resulting map is
+ * Forms the intersection of these two maps with a combining function. The resulting map is
* a map that has only keys present in both maps and has values produced from the original mappings
* by combining them with f.
*
diff --git a/src/library/scala/collection/immutable/Map.scala b/src/library/scala/collection/immutable/Map.scala
index f42794d09e..b5a852683a 100644
--- a/src/library/scala/collection/immutable/Map.scala
+++ b/src/library/scala/collection/immutable/Map.scala
@@ -44,7 +44,7 @@ trait Map[A, +B] extends Iterable[(A, B)]
object Map extends ImmutableMapFactory[Map] {
implicit def canBuildFrom[A, B]: CanBuildFrom[Coll, (A, B), Map[A, B]] = new MapCanBuildFrom[A, B]
- def empty[A, B]: Map[A, B] = new EmptyMap[A, B]
+ def empty[A, B]: Map[A, B] = EmptyMap.asInstanceOf[Map[A, B]]
class WithDefault[A, +B](underlying: Map[A, B], d: A => B) extends Map[A, B] {
override def size = underlying.size
@@ -58,12 +58,22 @@ object Map extends ImmutableMapFactory[Map] {
}
@serializable
- class EmptyMap[A, +B] extends Map[A, B] {
+ private object EmptyMap extends Map[Any, Nothing] {
+ override def size: Int = 0
+ def get(key: Any): Option[Nothing] = None
+ def iterator: Iterator[(Any, Nothing)] = Iterator.empty
+ override def updated [B1] (key: Any, value: B1): Map[Any, B1] = new Map1(key, value)
+ def + [B1](kv: (Any, B1)): Map[Any, B1] = updated(kv._1, kv._2)
+ def - (key: Any): Map[Any, Nothing] = this
+ }
+
+ @serializable @deprecated("use `Map.empty' instead")
+ class EmptyMap[A,B] extends Map[A,B] {
override def size: Int = 0
def get(key: A): Option[B] = None
def iterator: Iterator[(A, B)] = Iterator.empty
- override def updated [B1 >: B] (key: A, value: B1): Map[A, B1] = new Map1(key, value)
- def + [B1 >: B](kv: (A, B1)): Map[A, B1] = updated(kv._1, kv._2)
+ override def updated [B1] (key: A, value: B1): Map[A, B1] = new Map1(key, value)
+ def + [B1](kv: (A, B1)): Map[A, B1] = updated(kv._1, kv._2)
def - (key: A): Map[A, B] = this
}
@@ -78,7 +88,7 @@ object Map extends ImmutableMapFactory[Map] {
else new Map2(key1, value1, key, value)
def + [B1 >: B](kv: (A, B1)): Map[A, B1] = updated(kv._1, kv._2)
def - (key: A): Map[A, B] =
- if (key == key1) empty else this
+ if (key == key1) Map.empty else this
override def foreach[U](f: ((A, B)) => U): Unit = {
f((key1, value1))
}
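With `empty` now returning the private `EmptyMap` object cast to the requested type parameters, every empty immutable map shares one instance; the cast is sound because an empty map stores no values of either type. A small sketch of the observable behavior assumed from the patch above:

```scala
object EmptyMapDemo extends App {
  // Both calls return the same shared singleton, regardless of type params.
  val a = Map.empty[Int, String]
  val b = Map.empty[String, Double]
  assert(a eq b)

  // Growing from empty goes through the small-map classes (Map1, Map2, ...).
  val m = a + (1 -> "one")
  assert(m(1) == "one")
  assert((m - 1) == Map.empty[Int, String])
}
```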
diff --git a/src/library/scala/collection/immutable/MapLike.scala b/src/library/scala/collection/immutable/MapLike.scala
index a06bce1038..662321bb0c 100644
--- a/src/library/scala/collection/immutable/MapLike.scala
+++ b/src/library/scala/collection/immutable/MapLike.scala
@@ -41,8 +41,9 @@ import generic._
* @version 2.8
* @since 2.8
*/
-trait MapLike[A, +B, +This <: MapLike[A, B, This] with Map[A, B]] extends scala.collection.MapLike[A, B, This] {
-self =>
+trait MapLike[A, +B, +This <: MapLike[A, B, This] with Map[A, B]]
+ extends scala.collection.MapLike[A, B, This]
+{ self =>
import scala.collection.Traversable
@@ -74,16 +75,36 @@ self =>
*
* @param elems the traversable object.
*/
- override def ++[B1 >: B](elems: Traversable[(A, B1)]): immutable.Map[A, B1] =
- ((repr: immutable.Map[A, B1]) /: elems) (_ + _)
+ override def ++[B1 >: B](xs: TraversableOnce[(A, B1)]): immutable.Map[A, B1] =
+ ((repr: immutable.Map[A, B1]) /: xs) (_ + _)
- /** Adds a number of elements provided by an iterator
- * and returns a new collection with the added elements.
- *
- * @param iter the iterator
+ /** Filters this map by retaining only keys satisfying a predicate.
+ * @param p the predicate used to test keys
+ * @return an immutable map consisting only of those key/value pairs of this map where the key satisfies
+ * the predicate `p`. The resulting map wraps the original map without copying any elements.
*/
- override def ++[B1 >: B] (iter: Iterator[(A, B1)]): immutable.Map[A, B1] =
- ((repr: immutable.Map[A, B1]) /: iter) (_ + _)
+ override def filterKeys(p: A => Boolean): Map[A, B] = new DefaultMap[A, B] {
+ override def foreach[C](f: ((A, B)) => C): Unit = for (kv <- self) if (p(kv._1)) f(kv)
+ def iterator = self.iterator.filter(kv => p(kv._1))
+ override def contains(key: A) = self.contains(key) && p(key)
+ def get(key: A) = if (!p(key)) None else self.get(key)
+ }
+
+ /** Transforms this map by applying a function to every retrieved value.
+ * @param f the function used to transform values of this map.
+ * @return an immutable map view which maps every key of this map
+ * to `f(this(key))`. The resulting map wraps the original map without copying any elements.
+ */
+ override def mapValues[C](f: B => C): Map[A, C] = new DefaultMap[A, C] {
+ override def foreach[D](g: ((A, C)) => D): Unit = for ((k, v) <- self) g((k, f(v)))
+ def iterator = for ((k, v) <- self.iterator) yield (k, f(v))
+ override def size = self.size
+ override def contains(key: A) = self.contains(key)
+ def get(key: A) = self.get(key).map(f)
+ }
/** This function transforms all the values of mappings contained
* in this map with function <code>f</code>.
@@ -97,23 +118,6 @@ self =>
b.result
}
- /** Returns a new map with all key/value pairs for which the predicate
- * <code>p</code> returns <code>true</code>.
- *
- * @param p A predicate over key-value pairs
- * @note This method works by successively removing elements fro which the
- * predicate is false from this set.
- * If removal is slow, or you expect that most elements of the set$
- * will be removed, you might consider using <code>filter</code>
- * with a negated predicate instead.
- */
- override def filterNot(p: ((A, B)) => Boolean): This = {
- var res: This = repr
- for (kv <- this)
- if (p(kv)) res = (res - kv._1).asInstanceOf[This] // !!! concrete overrides abstract problem
- res
- }
-
@deprecated("use `updated' instead")
def update[B1 >: B](key: A, value: B1): immutable.Map[A, B1] = updated(key, value).asInstanceOf[immutable.Map[A, B1]]
}
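The new `filterKeys` and `mapValues` above return wrappers over `self` rather than building a new map, so the predicate or function runs lazily on each access and no elements are copied. A sketch of that observable behavior:

```scala
object MapViewDemo extends App {
  var evaluations = 0
  val base = Map('a' -> 1, 'b' -> 2, 'c' -> 3)

  // mapValues wraps the original map: the function runs on access,
  // and no intermediate map is built at creation time.
  val doubled = base.mapValues { v => evaluations += 1; v * 2 }
  assert(evaluations == 0)           // nothing evaluated yet
  assert(doubled('b') == 4)          // evaluated on lookup

  // filterKeys likewise filters lazily, without copying elements.
  val noB = base.filterKeys(_ != 'b')
  assert(!noB.contains('b'))
  assert(noB('a') == 1)
}
```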
diff --git a/src/library/scala/collection/immutable/MapProxy.scala b/src/library/scala/collection/immutable/MapProxy.scala
index 0ef7aa620a..371af042e7 100644
--- a/src/library/scala/collection/immutable/MapProxy.scala
+++ b/src/library/scala/collection/immutable/MapProxy.scala
@@ -37,5 +37,9 @@ trait MapProxy[A, +B] extends Map[A, B] with MapProxyLike[A, B, Map[A, B]]
override def + [B1 >: B](kv: (A, B1)): Map[A, B1] = newProxy(self + kv)
override def + [B1 >: B](elem1: (A, B1), elem2: (A, B1), elems: (A, B1) *) =
newProxy(self.+(elem1, elem2, elems: _*))
+
override def -(key: A) = newProxy(self - key)
+
+ override def filterKeys(p: A => Boolean) = self.filterKeys(p)
+ override def mapValues[C](f: B => C) = self.mapValues(f)
}
diff --git a/src/library/scala/collection/immutable/NumericRange.scala b/src/library/scala/collection/immutable/NumericRange.scala
index 5514f7a24d..d3e4558884 100644
--- a/src/library/scala/collection/immutable/NumericRange.scala
+++ b/src/library/scala/collection/immutable/NumericRange.scala
@@ -34,11 +34,18 @@ import generic._
* @version 2.8
*/
@serializable
-abstract class NumericRange[+T]
+abstract class NumericRange[T]
(val start: T, val end: T, val step: T, val isInclusive: Boolean)
(implicit num: Integral[T])
extends IndexedSeq[T]
{
+ /** Note that NumericRange must be invariant so that constructs
+ * such as
+ *
+ * 1L to 10 by 5
+ *
+ * do not infer the range type as AnyVal.
+ */
import num._
private def fail(msg: String) = throw new IllegalArgumentException(msg)
@@ -56,20 +63,18 @@ extends IndexedSeq[T]
// inclusive/exclusiveness captured this way because we do not have any
// concept of a "unit", we can't just add an epsilon to an exclusive
// endpoint to make it inclusive (as can be done with the int-based Range.)
- protected def limitTest[U >: T](x: U)(implicit unum: Integral[U]) =
- !isEmpty && isInclusive && unum.equiv(x, end)
+ protected def limitTest(x: T) = !isEmpty && isInclusive && equiv(x, end)
protected def underlying = collection.immutable.IndexedSeq.empty[T]
/** Create a new range with the start and end values of this range and
* a new <code>step</code>.
*/
- def by[U >: T](newStep: U)(implicit unum: Integral[U]): NumericRange[U] =
- copy(start, end, newStep)
+ def by(newStep: T): NumericRange[T] = copy(start, end, newStep)
/** Create a copy of this range.
*/
- def copy[U >: T](start: U, end: U, step: U)(implicit unum: Integral[U]): NumericRange[U]
+ def copy(start: T, end: T, step: T): NumericRange[T]
override def foreach[U](f: T => U) {
var i = start
@@ -115,9 +120,8 @@ extends IndexedSeq[T]
}
// a well-typed contains method.
- def containsTyped[U >: T](x: U)(implicit unum: Integral[U]): Boolean = {
- import unum._
- def divides(d: U, by: U) = equiv(d % by, zero)
+ def containsTyped(x: T): Boolean = {
+ def divides(d: T, by: T) = equiv(d % by, zero)
limitTest(x) || (
if (step > zero)
@@ -154,7 +158,7 @@ extends IndexedSeq[T]
// XXX This may be incomplete.
new NumericRange[A](fm(start), fm(end), fm(step), isInclusive) {
- def copy[A1 >: A](start: A1, end: A1, step: A1)(implicit unum: Integral[A1]): NumericRange[A1] =
+ def copy(start: A, end: A, step: A): NumericRange[A] =
if (isInclusive) NumericRange.inclusive(start, end, step)
else NumericRange(start, end, step)
@@ -162,8 +166,7 @@ extends IndexedSeq[T]
override def foreach[U](f: A => U) { underlyingRange foreach (x => f(fm(x))) }
override def isEmpty = underlyingRange.isEmpty
override def apply(idx: Int): A = fm(underlyingRange(idx))
- override def containsTyped[A1 >: A](el: A1)(implicit unum: Integral[A1]) =
- underlyingRange exists (x => fm(x) == el)
+ override def containsTyped(el: A) = underlyingRange exists (x => fm(x) == el)
}
}
@@ -200,7 +203,7 @@ extends IndexedSeq[T]
object NumericRange {
class Inclusive[T](start: T, end: T, step: T)(implicit num: Integral[T])
extends NumericRange(start, end, step, true) {
- def copy[U >: T](start: U, end: U, step: U)(implicit unum: Integral[U]): Inclusive[U] =
+ def copy(start: T, end: T, step: T): Inclusive[T] =
NumericRange.inclusive(start, end, step)
def exclusive: Exclusive[T] = NumericRange(start, end, step)
@@ -208,7 +211,7 @@ object NumericRange {
class Exclusive[T](start: T, end: T, step: T)(implicit num: Integral[T])
extends NumericRange(start, end, step, false) {
- def copy[U >: T](start: U, end: U, step: U)(implicit unum: Integral[U]): Exclusive[U] =
+ def copy(start: T, end: T, step: T): Exclusive[T] =
NumericRange(start, end, step)
def inclusive: Inclusive[T] = NumericRange.inclusive(start, end, step)
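Dropping covariance (and the `U >: T` overloads on `by`, `copy`, and `containsTyped`) keeps the element type fixed, as the in-code note explains: with a covariant `T`, a mixed-width expression could widen the range type toward `AnyVal`. A sketch checking the invariant behavior with `Long`:

```scala
object NumericRangeDemo extends App {
  // With NumericRange invariant in T, the element type stays Long
  // rather than widening through a covariant upper bound.
  val r = 1L to 10L by 3L            // 1, 4, 7, 10
  assert(r.toList == List(1L, 4L, 7L, 10L))
  assert(r.contains(7L))
  assert(!r.contains(8L))            // 8 is not on the step grid
}
```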
diff --git a/src/library/scala/collection/immutable/PagedSeq.scala b/src/library/scala/collection/immutable/PagedSeq.scala
index bde8d67ffe..bd12502520 100644
--- a/src/library/scala/collection/immutable/PagedSeq.scala
+++ b/src/library/scala/collection/immutable/PagedSeq.scala
@@ -202,7 +202,7 @@ private class Page[T: ClassManifest](val num: Int) {
/** The next page in the sequence */
var next : Page[T] = null
- /** A later page in the sequence, serves a cachae for pointing to last page */
+ /** A later page in the sequence, serves as a cache for pointing to the last page */
var later : Page[T] = this
/** The number of characters read into this page */
@@ -218,11 +218,11 @@ private class Page[T: ClassManifest](val num: Int) {
/** The index of the first character in this page relative to the whole sequence */
final def start = num * PageSize
- /** The index of the character following the last charcater in this page relative
+ /** The index of the character following the last character in this page relative
* to the whole sequence */
final def end = start + filled
- /** The currently last page in the sequence; might change as more charcaters are appended */
+ /** The currently last page in the sequence; might change as more characters are appended */
final def latest: Page[T] = {
if (later.next != null) later = later.next.latest
later
diff --git a/src/library/scala/collection/immutable/Queue.scala b/src/library/scala/collection/immutable/Queue.scala
index 9957f90ab3..02d344ceea 100644
--- a/src/library/scala/collection/immutable/Queue.scala
+++ b/src/library/scala/collection/immutable/Queue.scala
@@ -12,12 +12,8 @@
package scala.collection
package immutable
-import scala.annotation.tailrec
-
-object Queue {
- val Empty: Queue[Nothing] = new Queue(Nil, Nil)
- def apply[A](elems: A*) = new Queue(Nil, elems.toList)
-}
+import generic._
+import mutable.{ Builder, ListBuffer }
 /** <code>Queue</code> objects implement data structures that allow you to
* insert and retrieve elements in a first-in-first-out (FIFO) manner.
@@ -28,10 +24,13 @@ object Queue {
*/
@serializable
@SerialVersionUID(-7622936493364270175L)
-class Queue[+A] protected(
- protected val in: List[A],
- protected val out: List[A]) extends Seq[A]
-{
+class Queue[+A] protected(protected val in: List[A], protected val out: List[A])
+ extends Seq[A]
+ with GenericTraversableTemplate[A, Queue]
+ with SeqLike[A, Queue[A]] {
+
+ override def companion: GenericCompanion[Queue] = Queue
+
/** Returns the <code>n</code>-th element of this queue.
* The first element is at position 0.
*
@@ -127,3 +126,13 @@ class Queue[+A] protected(
*/
override def toString() = mkString("Queue(", ", ", ")")
}
+
+object Queue extends SeqFactory[Queue] {
+ implicit def canBuildFrom[A]: CanBuildFrom[Coll, A, Queue[A]] = new GenericCanBuildFrom[A]
+ def newBuilder[A]: Builder[A, Queue[A]] = new ListBuffer[A] mapResult (x => new Queue[A](Nil, x.toList))
+ override def empty[A]: Queue[A] = new Queue[A](Nil, Nil)
+ override def apply[A](xs: A*): Queue[A] = new Queue[A](Nil, xs.toList)
+
+ @deprecated("use `Queue.empty' instead")
+ val Empty: Queue[Nothing] = Queue()
+}
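Moving the companion to `SeqFactory[Queue]` gives `Queue` the standard varargs constructor and a builder, so transformations return queues instead of generic sequences (object name is illustrative):

```scala
import scala.collection.immutable.Queue

object QueueDemo extends App {
  val q = Queue(1, 2, 3)

  // FIFO behavior: dequeue returns the front element and the rest.
  val (front, rest) = q.dequeue
  assert(front == 1)
  assert(rest == Queue(2, 3))

  // The new builder makes map/filter rebuild a Queue, not a plain Seq.
  assert(q.map(_ * 2) == Queue(2, 4, 6))
}
```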
diff --git a/src/library/scala/collection/immutable/Range.scala b/src/library/scala/collection/immutable/Range.scala
index 47a97664de..43b11b67be 100644
--- a/src/library/scala/collection/immutable/Range.scala
+++ b/src/library/scala/collection/immutable/Range.scala
@@ -39,25 +39,34 @@ class Range(val start: Int, val end: Int, val step: Int) extends IndexedSeq[Int]
def isInclusive = false
- protected def limit = end
-
override def foreach[U](f: Int => U) {
- var i = start
- while (if (step > 0) i < limit else i > limit) {
+ if (fullLength > 0) {
+ val last = this.last
+ var i = start
+ while (i != last) {
+ f(i)
+ i += step
+ }
f(i)
- i += step
}
}
- lazy val length: Int = {
- def plen(start: Int, limit: Int, step: Int) =
- if (limit <= start) 0 else (limit - start - 1) / step + 1
- if (step > 0) plen(start, limit, step)
- else plen(limit, start, -step)
+ override def last: Int = if (step == 1 || step == -1) {
+ end - step
+ } else {
+ val size = end.toLong - start.toLong
+ val inclusiveLast = (size / step.toLong * step.toLong + start.toLong).toInt
+ if (size % step == 0) inclusiveLast - step else inclusiveLast
}
- final override def isEmpty =
- if (step > 0) start >= limit else start <= limit
+ def length: Int = fullLength.toInt
+
+ protected def fullLength: Long = if (end > start == step > 0 && start != end)
+ ((last.toLong - start.toLong) / step.toLong + 1)
+ else
+ 0
+
+ final override def isEmpty = length == 0
@inline
final def apply(idx: Int): Int = {
@@ -66,12 +75,19 @@ class Range(val start: Int, val end: Int, val step: Int) extends IndexedSeq[Int]
}
// take and drop have to be tolerant of large values without overflowing
- private def locationAfterN(n: Int) = start + step * (0 max n min length)
+ private def locationAfterN(n: Int) = if (n > 0) {
+ if (step > 0)
+ ((start.toLong + step.toLong * n.toLong) min last.toLong).toInt
+ else
+ ((start.toLong + step.toLong * n.toLong) max last.toLong).toInt
+ } else {
+ start
+ }
- final override def take(n: Int): Range = {
- val limit1 = locationAfterN(n)
- if (step > 0) Range(start, limit1 min limit, step)
- else Range(start, limit1 max limit, step)
+ final override def take(n: Int): Range = if (n > 0 && length > 0) {
+ Range(start, locationAfterN(n - 1), step).inclusive
+ } else {
+ Range(start, start, step)
}
final override def drop(n: Int): Range =
@@ -85,7 +101,11 @@ class Range(val start: Int, val end: Int, val step: Int) extends IndexedSeq[Int]
private def skip(p: Int => Boolean): Int = {
var s = start
- while ((if (step > 0) s < limit else s > limit) && p(s)) s += step
+ if (length > 0) {
+ val last = this.last
+ while ((if (step > 0) s <= last else s >= last) && p(s))
+ s += step
+ }
s
}
@@ -103,16 +123,18 @@ class Range(val start: Int, val end: Int, val step: Int) extends IndexedSeq[Int]
final override def dropRight(n: Int): Range = take(length - n)
- final override def reverse: Range = new Range.Inclusive(last, start, -step)
+ final override def reverse: Range = if (length > 0) new Range.Inclusive(last, start, -step) else this
/** Make range inclusive.
- * @pre if (step > 0) end != MaxInt else end != MinInt
*/
def inclusive = new Range.Inclusive(start, end, step)
- def contains(x: Int): Boolean =
- if (step > 0) start <= x && x < limit && (x - start) % step == 0
- else start >= x && x > limit && (start - x) % step == 0
+ final def contains(x: Int): Boolean = if (length > 0) {
+ if (step > 0) start <= x && x <= last && (x - start) % step == 0
+ else start >= x && x >= last && (start - x) % step == 0
+ } else {
+ false
+ }
override def equals(other: Any) = other match {
case x: Range =>
@@ -139,37 +161,45 @@ object Range {
class Inclusive(start: Int, end: Int, step: Int) extends Range(start, end, step) {
override def isInclusive = true
- override protected val limit = end + math.signum(step)
override protected def copy(start: Int, end: Int, step: Int): Range = new Inclusive(start, end, step)
+ override def last: Int = if (step == 1 || step == -1)
+ end
+ else
+ ((end.toLong - start.toLong) / step.toLong * step.toLong + start.toLong).toInt
+ protected override def fullLength: Long = if (end > start == step > 0 || start == end)
+ ((last.toLong - start.toLong) / step.toLong + 1)
+ else
+ 0
}
- /** Make a range from `start` until `end` (exclusive) with step value 1.
+ /** Make a range from `start` until `end` (exclusive) with given step value.
+ * @note step != 0
*/
def apply(start: Int, end: Int, step: Int): Range = new Range(start, end, step)
 /** Make a range from `start` to `end` inclusive with step value 1.
- * @pre end != MaxInt
*/
def apply(start: Int, end: Int): Range with ByOne = new Range(start, end, 1) with ByOne
/** Make an inclusive range from start to end with given step value.
- * @pre step != 0
- * @pre if (step > 0) end != MaxInt else end != MinInt
+ * @note step != 0
*/
def inclusive(start: Int, end: Int, step: Int): Range.Inclusive = new Inclusive(start, end, step)
/** Make an inclusive range from start to end with step value 1.
- * @pre end != MaxInt
*/
def inclusive(start: Int, end: Int): Range.Inclusive with ByOne = new Inclusive(start, end, 1) with ByOne
trait ByOne extends Range {
override final def foreach[U](f: Int => U) {
- var i = start
- val l = limit
- while (i < l) {
+ if (length > 0) {
+ val last = this.last
+ var i = start
+ while (i != last) {
+ f(i)
+ i += 1
+ }
f(i)
- i += 1
}
}
}
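The reworked `Range` computes `last` and `fullLength` through `Long` arithmetic, so endpoints near `Int.MaxValue` no longer require the old `@pre end != MaxInt` caveats, and `contains`, `take`, and `foreach` are all expressed in terms of `last` instead of an adjusted `limit`. A sketch of the resulting behavior:

```scala
object RangeDemo extends App {
  val r = 1 until 10 by 3            // 1, 4, 7
  assert(r.last == 7)
  assert(r.length == 3)
  assert(r.contains(4))
  assert(!r.contains(5))             // 5 is not on the step grid

  // An inclusive range whose step overshoots the end still reports
  // the true last element.
  val inc = 0 to 10 by 4             // 0, 4, 8
  assert(inc.last == 8)
  assert(inc.length == 3)
}
```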
diff --git a/src/library/scala/collection/immutable/RedBlack.scala b/src/library/scala/collection/immutable/RedBlack.scala
index dfb34552cd..e7b4f3c978 100644
--- a/src/library/scala/collection/immutable/RedBlack.scala
+++ b/src/library/scala/collection/immutable/RedBlack.scala
@@ -12,7 +12,8 @@
package scala.collection
package immutable
-/**
+/** A base class containing the implementations of TreeMap and TreeSet
+ *
* @since 2.3
*/
@serializable @SerialVersionUID(8691885935445612921L)
diff --git a/src/library/scala/collection/immutable/Set.scala b/src/library/scala/collection/immutable/Set.scala
index be1e86bcdd..1bec1b9a48 100644
--- a/src/library/scala/collection/immutable/Set.scala
+++ b/src/library/scala/collection/immutable/Set.scala
@@ -41,12 +41,22 @@ trait Set[A] extends Iterable[A]
object Set extends SetFactory[Set] {
implicit def canBuildFrom[A]: CanBuildFrom[Coll, A, Set[A]] = setCanBuildFrom[A]
- override def empty[A]: Set[A] = new EmptySet[A]
+ override def empty[A]: Set[A] = EmptySet.asInstanceOf[Set[A]]
private val hashSeed = "Set".hashCode
/** An optimized representation for immutable empty sets */
@serializable
+ private object EmptySet extends Set[Any] {
+ override def size: Int = 0
+ def contains(elem: Any): Boolean = false
+ def + (elem: Any): Set[Any] = new Set1(elem)
+ def - (elem: Any): Set[Any] = this
+ def iterator: Iterator[Any] = Iterator.empty
+ override def foreach[U](f: Any => U): Unit = {}
+ }
+
+ @serializable @deprecated("use `Set.empty' instead")
class EmptySet[A] extends Set[A] {
override def size: Int = 0
def contains(elem: A): Boolean = false
@@ -66,7 +76,7 @@ object Set extends SetFactory[Set] {
if (contains(elem)) this
else new Set2(elem1, elem)
def - (elem: A): Set[A] =
- if (elem == elem1) new EmptySet[A]
+ if (elem == elem1) Set.empty
else this
def iterator: Iterator[A] =
Iterator(elem1)
diff --git a/src/library/scala/collection/immutable/SortedMap.scala b/src/library/scala/collection/immutable/SortedMap.scala
index 316cab9b50..919b529a49 100644
--- a/src/library/scala/collection/immutable/SortedMap.scala
+++ b/src/library/scala/collection/immutable/SortedMap.scala
@@ -31,6 +31,8 @@ trait SortedMap[A, +B] extends Map[A, B]
override protected[this] def newBuilder : Builder[(A, B), SortedMap[A, B]] =
SortedMap.newBuilder[A, B]
+ override def empty: SortedMap[A, B] = SortedMap.empty
+
override def updated [B1 >: B](key: A, value: B1): SortedMap[A, B1] = this + ((key, value))
/** Add a key/value pair to this map.
@@ -56,16 +58,8 @@ trait SortedMap[A, +B] extends Map[A, B]
*
* @param elems the traversable object.
*/
- override def ++[B1 >: B](elems: scala.collection.Traversable[(A, B1)]): SortedMap[A, B1] =
- ((repr: SortedMap[A, B1]) /: elems) (_ + _)
-
- /** Adds a number of elements provided by an iterator
- * and returns a new collection with the added elements.
- *
- * @param iter the iterator
- */
- override def ++[B1 >: B] (iter: Iterator[(A, B1)]): SortedMap[A, B1] =
- ((repr: SortedMap[A, B1]) /: iter) (_ + _)
+ override def ++[B1 >: B](xs: TraversableOnce[(A, B1)]): SortedMap[A, B1] =
+ ((repr: SortedMap[A, B1]) /: xs) (_ + _)
}
/**
diff --git a/src/library/scala/collection/immutable/Stack.scala b/src/library/scala/collection/immutable/Stack.scala
index a5d7e9515a..640fb39af5 100644
--- a/src/library/scala/collection/immutable/Stack.scala
+++ b/src/library/scala/collection/immutable/Stack.scala
@@ -37,7 +37,7 @@ object Stack extends SeqFactory[Stack] {
@serializable @SerialVersionUID(1976480595012942526L)
class Stack[+A] protected (protected val elems: List[A]) extends LinearSeq[A]
with GenericTraversableTemplate[A, Stack]
- with LinearSeqLike[A, Stack[A]] {
+ with LinearSeqOptimized[A, Stack[A]] {
override def companion: GenericCompanion[Stack] = Stack
def this() = this(Nil)
@@ -74,18 +74,8 @@ class Stack[+A] protected (protected val elems: List[A]) extends LinearSeq[A]
* @param elems the iterator object.
* @return the stack with the new elements on top.
*/
- def pushAll[B >: A](elems: Iterator[B]): Stack[B] =
- ((this: Stack[B]) /: elems)(_ push _)
-
- /** Push all elements provided by the given traversable object onto
- * the stack. The last element returned by the iterable object
- * will be on top of the new stack.
- *
- * @param elems the iterable object.
- * @return the stack with the new elements on top.
- */
- def pushAll[B >: A](elems: scala.collection.Traversable[B]): Stack[B] =
- ((this: Stack[B]) /: elems)(_ push _)
+ def pushAll[B >: A](xs: TraversableOnce[B]): Stack[B] =
+ ((this: Stack[B]) /: xs.toIterator)(_ push _)
/** Returns the top element of the stack. An error is signaled if
* there is no element on the stack.
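The merged `pushAll` is just a left fold of `push` over the input. A sketch of that pattern, with `List` standing in for the immutable `Stack` (which was removed from the library in later releases) and `foldLeft` spelling out the `/:` alias used in the patch:

```scala
// Hypothetical helper mirroring pushAll: fold each element onto the stack,
// so the last element of xs ends up on top.
def pushAll[A](stack: List[A], xs: IterableOnce[A]): List[A] =
  xs.iterator.foldLeft(stack)((s, x) => x :: s)

val s = pushAll(List(1), Seq(2, 3)) // 3 is pushed last, so it is on top
```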
diff --git a/src/library/scala/collection/immutable/Stream.scala b/src/library/scala/collection/immutable/Stream.scala
index 1377fbe59d..3b10963ddb 100644
--- a/src/library/scala/collection/immutable/Stream.scala
+++ b/src/library/scala/collection/immutable/Stream.scala
@@ -40,7 +40,7 @@ import scala.annotation.tailrec
*/
abstract class Stream[+A] extends LinearSeq[A]
with GenericTraversableTemplate[A, Stream]
- with LinearSeqLike[A, Stream[A]] {
+ with LinearSeqOptimized[A, Stream[A]] {
self =>
override def companion: GenericCompanion[Stream] = Stream
@@ -113,17 +113,21 @@ self =>
* then StreamBuilder will be chosen for the implicit.
* we recognize that fact and optimize to get more laziness.
*/
- override def ++[B >: A, That](that: Traversable[B])(implicit bf: CanBuildFrom[Stream[A], B, That]): That = {
+ override def ++[B >: A, That](that: TraversableOnce[B])(implicit bf: CanBuildFrom[Stream[A], B, That]): That = {
// we assume there is no other builder factory on streams and therefore know that That = Stream[A]
(if (isEmpty) that.toStream
else new Stream.Cons(head, (tail ++ that).asInstanceOf[Stream[A]])).asInstanceOf[That]
}
- /** Create a new stream which contains all elements of this stream
- * followed by all elements of Iterator `that'
+ /**
+ * Create a new stream which contains all intermediate results of applying the
+ * operator to successive elements, going left to right.
+ * @note This works because the target type `That` of the builder is a `Stream`.
*/
- override def++[B >: A, That](that: Iterator[B])(implicit bf: CanBuildFrom[Stream[A], B, That]): That =
- this ++ that.toStream
+ override final def scanLeft[B, That](z: B)(op: (B, A) => B)(implicit bf: CanBuildFrom[Stream[A], B, That]): That = {
+ (if (this.isEmpty) Stream(z)
+ else new Stream.Cons(z, tail.scanLeft(op(z, head))(op).asInstanceOf[Stream[B]])).asInstanceOf[That]
+ }
/** Returns the stream resulting from applying the given function
* <code>f</code> to each element of this stream.
@@ -344,9 +348,9 @@ self =>
/** Builds a new stream from this stream in which any duplicates (wrt to ==) removed.
* Among duplicate elements, only the first one is retained in the result stream
*/
- override def removeDuplicates: Stream[A] =
+ override def distinct: Stream[A] =
if (isEmpty) this
- else new Stream.Cons(head, tail.filter(head !=).removeDuplicates)
+ else new Stream.Cons(head, tail.filter(head !=).distinct)
/** Returns a new sequence of given length containing the elements of this sequence followed by zero
* or more occurrences of given elements.
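The two `Stream` changes above, renaming `removeDuplicates` to `distinct` and adding a lazy `scanLeft`, can be exercised like this (on the since-deprecated `Stream`; note that `scanLeft` stays lazy even on an infinite stream):

```scala
// scanLeft emits the seed first, then each running result:
val sums      = Stream.from(1).scanLeft(0)(_ + _) // 0, 1, 3, 6, 10, ...
val firstFive = sums.take(5).toList

// distinct keeps the first occurrence of each element:
val uniq = Stream(1, 2, 1, 3, 2).distinct.toList
```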
@@ -420,12 +424,12 @@ object Stream extends SeqFactory[Stream] {
import scala.collection.{Iterable, Seq, IndexedSeq}
/** A builder for streams
- * @note: This builder is lazy only in the sense that it does not go downs the spine
- * of traversables that are added as a whole. If more laziness can be achieved,
- * this builder should be bypassed.
+ * @note This builder is lazy only in the sense that it does not go down the spine
+ * of traversables that are added as a whole. If more laziness can be achieved,
+ * this builder should be bypassed.
*/
class StreamBuilder[A] extends scala.collection.mutable.LazyBuilder[A, Stream[A]] {
- def result: Stream[A] = (for (xs <- parts.iterator; x <- xs.toIterable.iterator) yield x).toStream
+ def result: Stream[A] = parts.toStream flatMap (_.toStream)
}
object Empty extends Stream[Nothing] {
diff --git a/src/library/scala/collection/immutable/StringLike.scala b/src/library/scala/collection/immutable/StringLike.scala
index 500de352f6..5b5a627cfe 100644
--- a/src/library/scala/collection/immutable/StringLike.scala
+++ b/src/library/scala/collection/immutable/StringLike.scala
@@ -34,7 +34,7 @@ import StringLike._
/**
* @since 2.8
*/
-trait StringLike[+Repr] extends IndexedSeqLike[Char, Repr] with Ordered[String] {
+trait StringLike[+Repr] extends IndexedSeqOptimized[Char, Repr] with Ordered[String] {
self =>
/** Creates a string builder buffer as builder for this class */
@@ -263,7 +263,7 @@ self =>
* @param args the arguments used to instantiating the pattern.
* @throws java.lang.IllegalArgumentException
*/
- def format(l: java.util.Locale, args: Any*): String =
+ def formatLocal(l: java.util.Locale, args: Any*): String =
java.lang.String.format(l, toString, args map unwrapArg: _*)
}
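The rename above frees the `format` name from an overload clash with `format(args: Any*)`; the locale-taking variant survives in later releases under the new name:

```scala
import java.util.Locale

// formatLocal delegates to java.lang.String.format with an explicit locale:
val grouped = "%,d".formatLocal(Locale.US, 1234567)
```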
diff --git a/src/library/scala/collection/immutable/TreeSet.scala b/src/library/scala/collection/immutable/TreeSet.scala
index 1a3ed38e1c..79e1a6b00b 100644
--- a/src/library/scala/collection/immutable/TreeSet.scala
+++ b/src/library/scala/collection/immutable/TreeSet.scala
@@ -19,8 +19,7 @@ import mutable.{Builder, AddingBuilder}
*
* @since 1
*/
-object TreeSet extends SortedSetFactory[TreeSet]{
-
+object TreeSet extends SortedSetFactory[TreeSet] {
implicit def implicitBuilder[A](implicit ordering: Ordering[A]): Builder[A, TreeSet[A]] = newBuilder[A](ordering)
override def newBuilder[A](implicit ordering: Ordering[A]): Builder[A, TreeSet[A]] =
new AddingBuilder(empty[A](ordering))
@@ -28,7 +27,6 @@ object TreeSet extends SortedSetFactory[TreeSet]{
/** The empty set of this type
*/
def empty[A](implicit ordering: Ordering[A]) = new TreeSet[A]
-
}
/** This class implements immutable sets using a tree.
diff --git a/src/library/scala/collection/immutable/Vector.scala b/src/library/scala/collection/immutable/Vector.scala
index 1326768090..6defe66d6f 100644
--- a/src/library/scala/collection/immutable/Vector.scala
+++ b/src/library/scala/collection/immutable/Vector.scala
@@ -19,24 +19,25 @@ import scala.collection.mutable.Builder
object Vector extends SeqFactory[Vector] {
- /*private[immutable]*/ val BF = new GenericCanBuildFrom[Nothing] {
+ private[immutable] val BF = new GenericCanBuildFrom[Nothing] {
override def apply() = newBuilder[Nothing]
}
@inline implicit def canBuildFrom[A]: CanBuildFrom[Coll, A, Vector[A]] =
BF.asInstanceOf[CanBuildFrom[Coll, A, Vector[A]]]
def newBuilder[A]: Builder[A, Vector[A]] = new VectorBuilder[A]
- /*private[immutable]*/ val NIL = new Vector[Nothing](0, 0, 0)
+ private[immutable] val NIL = new Vector[Nothing](0, 0, 0)
@inline override def empty[A]: Vector[A] = NIL
}
-// TODO: most members are still public -> restrict access (caveat: private prevents inlining)
+// In principle, most members should be private; however, access modifiers must
+// be chosen carefully so as not to prevent method inlining
@serializable
-final class Vector[+A](startIndex: Int, endIndex: Int, focus: Int) extends Seq[A]
+final class Vector[+A](startIndex: Int, endIndex: Int, focus: Int) extends IndexedSeq[A]
with GenericTraversableTemplate[A, Vector]
- with SeqLike[A, Vector[A]]
- with VectorPointer[A @uncheckedVariance] {
+ with IndexedSeqLike[A, Vector[A]]
+ with VectorPointer[A @uncheckedVariance] { self =>
override def companion: GenericCompanion[Vector] = Vector
@@ -45,7 +46,7 @@ override def companion: GenericCompanion[Vector] = Vector
//assert(focus >= 0, focus+"<0")
//assert(focus <= endIndex, focus+">"+endIndex)
- /*private*/ var dirty = false
+ private[immutable] var dirty = false
def length = endIndex - startIndex
@@ -60,20 +61,35 @@ override def companion: GenericCompanion[Vector] = Vector
s
}
+
+ // can still be improved
+ override /*SeqLike*/
+ def reverseIterator: Iterator[A] = new Iterator[A] {
+ private var i = self.length
+ def hasNext: Boolean = 0 < i
+ def next: A =
+ if (0 < i) {
+ i -= 1
+ self(i)
+ } else Iterator.empty.next
+ }
+
+ // TODO: reverse
+
// TODO: check performance of foreach/map etc. should override or not?
// Ideally, clients will inline calls to map all the way down, including the iterator/builder methods.
// In principle, escape analysis could even remove the iterator/builder allocations and do it
// with local variables exclusively. But we're not quite there yet ...
- @inline def foreach0[U](f: A => U): Unit = iterator.foreach0(f)
- @inline def map0[B, That](f: A => B)(implicit bf: CanBuildFrom[Vector[A], B, That]): That = {
+ @deprecated("this method is experimental and will be removed in a future release")
+ @inline def foreachFast[U](f: A => U): Unit = iterator.foreachFast(f)
+ @deprecated("this method is experimental and will be removed in a future release")
+ @inline def mapFast[B, That](f: A => B)(implicit bf: CanBuildFrom[Vector[A], B, That]): That = {
val b = bf(repr)
- foreach0(x => b += f(x))
+ foreachFast(x => b += f(x))
b.result
}
- // TODO: reverse
- // TODO: reverseIterator
def apply(index: Int): A = {
val idx = checkRangeConvert(index)
@@ -108,41 +124,71 @@ override def companion: GenericCompanion[Vector] = Vector
}
override def take(n: Int): Vector[A] = {
- if (n < 0) throw new IllegalArgumentException(n.toString)
- if (startIndex + n < endIndex) {
+ if (n <= 0)
+ Vector.empty
+ else if (startIndex + n < endIndex)
dropBack0(startIndex + n)
- } else
+ else
this
}
override def drop(n: Int): Vector[A] = {
- if (n < 0) throw new IllegalArgumentException(n.toString)
- if (startIndex + n < endIndex) {
+ if (n <= 0)
+ this
+ else if (startIndex + n < endIndex)
dropFront0(startIndex + n)
- } else
+ else
Vector.empty
}
override def takeRight(n: Int): Vector[A] = {
- if (n < 0) throw new IllegalArgumentException(n.toString)
- if (endIndex - n > startIndex) {
+ if (n <= 0)
+ Vector.empty
+ else if (endIndex - n > startIndex)
dropFront0(endIndex - n)
- } else
+ else
this
}
override def dropRight(n: Int): Vector[A] = {
- if (n < 0) throw new IllegalArgumentException(n.toString)
- if (endIndex - n > startIndex) {
+ if (n <= 0)
+ this
+ else if (endIndex - n > startIndex)
dropBack0(endIndex - n)
- } else
+ else
Vector.empty
}
+ override /*IterableLike*/ def head: A = {
+ if (isEmpty) throw new UnsupportedOperationException("empty.head")
+ apply(0)
+ }
+
+ override /*TraversableLike*/ def tail: Vector[A] = {
+ if (isEmpty) throw new UnsupportedOperationException("empty.tail")
+ drop(1)
+ }
+
+ override /*TraversableLike*/ def last: A = {
+ if (isEmpty) throw new UnsupportedOperationException("empty.last")
+ apply(length-1)
+ }
+
+ override /*TraversableLike*/ def init: Vector[A] = {
+ if (isEmpty) throw new UnsupportedOperationException("empty.init")
+ dropRight(1)
+ }
+
+ override /*IterableLike*/ def slice(from: Int, until: Int): Vector[A] =
+ take(until).drop(from)
+
+ override /*IterableLike*/ def splitAt(n: Int): (Vector[A], Vector[A]) = (take(n), drop(n))
+
+
// semi-private api
- def updateAt[B >: A](index: Int, elem: B): Vector[B] = {
+ private[immutable] def updateAt[B >: A](index: Int, elem: B): Vector[B] = {
val idx = checkRangeConvert(index)
val s = new Vector[B](startIndex, endIndex, idx)
s.initFrom(this)
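The slicing hunks above change negative arguments from throwing `IllegalArgumentException` to clamping, which is the behavior the collections kept from then on:

```scala
val v = Vector(1, 2, 3)

val t = v.take(-1) // clamped: empty vector, no exception
val d = v.drop(-1) // clamped: the whole vector

// splitAt is now expressible directly in terms of take and drop:
val (l, r) = v.splitAt(2)
```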
@@ -153,7 +199,6 @@ override def companion: GenericCompanion[Vector] = Vector
}
-
private def gotoPosWritable(oldIndex: Int, newIndex: Int, xor: Int) = if (dirty) {
gotoPosWritable1(oldIndex, newIndex, xor)
} else {
@@ -168,7 +213,7 @@ override def companion: GenericCompanion[Vector] = Vector
dirty = true
}
- def appendFront[B>:A](value: B): Vector[B] = {
+ private[immutable] def appendFront[B>:A](value: B): Vector[B] = {
if (endIndex != startIndex) {
var blockIndex = (startIndex - 1) & ~31
var lo = (startIndex - 1) & 31
@@ -263,7 +308,7 @@ override def companion: GenericCompanion[Vector] = Vector
}
}
- def appendBack[B>:A](value: B): Vector[B] = {
+ private[immutable] def appendBack[B>:A](value: B): Vector[B] = {
// //println("------- append " + value)
// debug()
if (endIndex != startIndex) {
@@ -361,22 +406,22 @@ override def companion: GenericCompanion[Vector] = Vector
display5 = copyRange(display5, oldLeft, newLeft)
}
- def zeroLeft(array: Array[AnyRef], index: Int): Unit = {
+ private def zeroLeft(array: Array[AnyRef], index: Int): Unit = {
var i = 0; while (i < index) { array(i) = null; i+=1 }
}
- def zeroRight(array: Array[AnyRef], index: Int): Unit = {
+ private def zeroRight(array: Array[AnyRef], index: Int): Unit = {
var i = index; while (i < array.length) { array(i) = null; i+=1 }
}
- def copyLeft(array: Array[AnyRef], right: Int): Array[AnyRef] = {
+ private def copyLeft(array: Array[AnyRef], right: Int): Array[AnyRef] = {
// if (array eq null)
// println("OUCH!!! " + right + "/" + depth + "/"+startIndex + "/" + endIndex + "/" + focus)
val a2 = new Array[AnyRef](array.length)
Platform.arraycopy(array, 0, a2, 0, right)
a2
}
- def copyRight(array: Array[AnyRef], left: Int): Array[AnyRef] = {
+ private def copyRight(array: Array[AnyRef], left: Int): Array[AnyRef] = {
val a2 = new Array[AnyRef](array.length)
Platform.arraycopy(array, left, a2, left, a2.length - left)
a2
@@ -592,18 +637,17 @@ final class VectorIterator[+A](_startIndex: Int, _endIndex: Int) extends Iterato
res
}
- // TODO: take
- // TODO: drop
+ // TODO: drop (important?)
- // TODO: remove!
- @inline def foreach0[U](f: A => U) { while (hasNext) f(next()) }
+ @deprecated("this method is experimental and will be removed in a future release")
+ @inline def foreachFast[U](f: A => U) { while (hasNext) f(next()) }
}
final class VectorBuilder[A]() extends Builder[A,Vector[A]] with VectorPointer[A @uncheckedVariance] {
- // TODO: possible alternative: start with display0 = null, blockIndex = -32, lo = 32
- // to avoid allocation initial array if the result will be empty anyways
+ // possible alternative: start with display0 = null, blockIndex = -32, lo = 32
+ // to avoid allocating the initial array if the result will be empty anyway
display0 = new Array[AnyRef](32)
depth = 1
@@ -612,7 +656,7 @@ final class VectorBuilder[A]() extends Builder[A,Vector[A]] with VectorPointer[A
private var lo = 0
def += (elem: A): this.type = {
- if (lo == 32) {
+ if (lo >= display0.length) {
val newBlockIndex = blockIndex+32
gotoNextBlockStartWritable(newBlockIndex, blockIndex ^ newBlockIndex)
blockIndex = newBlockIndex
@@ -624,11 +668,12 @@ final class VectorBuilder[A]() extends Builder[A,Vector[A]] with VectorPointer[A
}
def result: Vector[A] = {
- if (blockIndex + lo == 0)
+ val size = blockIndex + lo
+ if (size == 0)
return Vector.empty
- val s = new Vector[A](0, blockIndex + lo, 0) // TODO: should focus front or back?
+ val s = new Vector[A](0, size, 0) // should focus front or back?
s.initFrom(this)
- if (depth > 1) s.gotoPos(0, blockIndex + lo)
+ if (depth > 1) s.gotoPos(0, size - 1) // we're currently focused to size - 1, not size!
s
}
@@ -643,18 +688,18 @@ final class VectorBuilder[A]() extends Builder[A,Vector[A]] with VectorPointer[A
private[immutable] trait VectorPointer[T] {
- var depth: Int = _
- var display0: Array[AnyRef] = _
- var display1: Array[AnyRef] = _
- var display2: Array[AnyRef] = _
- var display3: Array[AnyRef] = _
- var display4: Array[AnyRef] = _
- var display5: Array[AnyRef] = _
+ private[immutable] var depth: Int = _
+ private[immutable] var display0: Array[AnyRef] = _
+ private[immutable] var display1: Array[AnyRef] = _
+ private[immutable] var display2: Array[AnyRef] = _
+ private[immutable] var display3: Array[AnyRef] = _
+ private[immutable] var display4: Array[AnyRef] = _
+ private[immutable] var display5: Array[AnyRef] = _
// used
- final def initFrom[U](that: VectorPointer[U]): Unit = initFrom(that, that.depth)
+ private[immutable] final def initFrom[U](that: VectorPointer[U]): Unit = initFrom(that, that.depth)
- final def initFrom[U](that: VectorPointer[U], depth: Int) = {
+ private[immutable] final def initFrom[U](that: VectorPointer[U], depth: Int) = {
this.depth = depth
(depth - 1) match {
case -1 =>
@@ -690,7 +735,7 @@ private[immutable] trait VectorPointer[T] {
// requires structure is at pos oldIndex = xor ^ index
- final def getElem(index: Int, xor: Int): T = {
+ private[immutable] final def getElem(index: Int, xor: Int): T = {
if (xor < (1 << 5)) { // level = 0
display0(index & 31).asInstanceOf[T]
} else
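The `xor < (1 << 5)`, `xor < (1 << 10)`, ... tests in `getElem` pick a trie level by how many 5-bit digits the index paths differ in; each level then selects a slot with `(index >> 5*level) & 31`. A sketch of that addressing with a hypothetical `slotPath` helper (not part of the patch):

```scala
// The slot path through a 32-ary trie is the index written in base 32,
// most significant digit first: 5 bits per level, masked to 0..31.
def slotPath(index: Int, depth: Int): List[Int] =
  ((depth - 1) to 0 by -1).toList.map(level => (index >> (5 * level)) & 31)

// 67 = 2 * 32 + 3, so at depth 2 it lives in slot 2 of level 1, slot 3 of level 0:
val path = slotPath(67, 2)
</imports>
```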
@@ -717,7 +762,7 @@ private[immutable] trait VectorPointer[T] {
// go to specific position
// requires structure is at pos oldIndex = xor ^ index,
// ensures structure is at pos index
- final def gotoPos(index: Int, xor: Int): Unit = {
+ private[immutable] final def gotoPos(index: Int, xor: Int): Unit = {
if (xor < (1 << 5)) { // level = 0 (could maybe removed)
} else
if (xor < (1 << 10)) { // level = 1
@@ -754,7 +799,7 @@ private[immutable] trait VectorPointer[T] {
// USED BY ITERATOR
// xor: oldIndex ^ index
- final def gotoNextBlockStart(index: Int, xor: Int): Unit = { // goto block start pos
+ private[immutable] final def gotoNextBlockStart(index: Int, xor: Int): Unit = { // goto block start pos
if (xor < (1 << 10)) { // level = 1
display0 = display1((index >> 5) & 31).asInstanceOf[Array[AnyRef]]
} else
@@ -787,7 +832,7 @@ private[immutable] trait VectorPointer[T] {
// USED BY BUILDER
// xor: oldIndex ^ index
- final def gotoNextBlockStartWritable(index: Int, xor: Int): Unit = { // goto block start pos
+ private[immutable] final def gotoNextBlockStartWritable(index: Int, xor: Int): Unit = { // goto block start pos
if (xor < (1 << 10)) { // level = 1
if (depth == 1) { display1 = new Array(32); display1(0) = display0; depth+=1}
display0 = new Array(32)
@@ -841,14 +886,15 @@ private[immutable] trait VectorPointer[T] {
// STUFF BELOW USED BY APPEND / UPDATE
- final def copyOf(a: Array[AnyRef]) = {
+ private[immutable] final def copyOf(a: Array[AnyRef]) = {
//println("copy")
+ if (a eq null) println ("NULL")
val b = new Array[AnyRef](a.length)
Platform.arraycopy(a, 0, b, 0, a.length)
b
}
- final def nullSlotAndCopy(array: Array[AnyRef], index: Int) = {
+ private[immutable] final def nullSlotAndCopy(array: Array[AnyRef], index: Int) = {
//println("copy and null")
val x = array(index)
array(index) = null
@@ -860,7 +906,7 @@ private[immutable] trait VectorPointer[T] {
// requires structure is at pos index
// ensures structure is clean and at pos index and writable at all levels except 0
- final def stabilize(index: Int) = (depth - 1) match {
+ private[immutable] final def stabilize(index: Int) = (depth - 1) match {
case 5 =>
display5 = copyOf(display5)
display4 = copyOf(display4)
@@ -901,16 +947,13 @@ private[immutable] trait VectorPointer[T] {
-
-
-
/// USED IN UPDATE AND APPEND BACK
// prepare for writing at an existing position
// requires structure is clean and at pos oldIndex = xor ^ newIndex,
// ensures structure is dirty and at pos newIndex and writable at level 0
- final def gotoPosWritable0(newIndex: Int, xor: Int): Unit = (depth - 1) match {
+ private[immutable] final def gotoPosWritable0(newIndex: Int, xor: Int): Unit = (depth - 1) match {
case 5 =>
display5 = copyOf(display5)
display4 = nullSlotAndCopy(display5, (newIndex >> 25) & 31).asInstanceOf[Array[AnyRef]]
@@ -943,7 +986,7 @@ private[immutable] trait VectorPointer[T] {
// requires structure is dirty and at pos oldIndex,
// ensures structure is dirty and at pos newIndex and writable at level 0
- final def gotoPosWritable1(oldIndex: Int, newIndex: Int, xor: Int): Unit = {
+ private[immutable] final def gotoPosWritable1(oldIndex: Int, newIndex: Int, xor: Int): Unit = {
if (xor < (1 << 5)) { // level = 0
display0 = copyOf(display0)
} else
@@ -1009,7 +1052,7 @@ private[immutable] trait VectorPointer[T] {
// USED IN DROP
- final def copyRange(array: Array[AnyRef], oldLeft: Int, newLeft: Int) = {
+ private[immutable] final def copyRange(array: Array[AnyRef], oldLeft: Int, newLeft: Int) = {
val elems = new Array[AnyRef](32)
Platform.arraycopy(array, oldLeft, elems, newLeft, 32 - math.max(newLeft,oldLeft))
elems
@@ -1023,7 +1066,7 @@ private[immutable] trait VectorPointer[T] {
// requires structure is clean and at pos oldIndex,
// ensures structure is dirty and at pos newIndex and writable at level 0
- final def gotoFreshPosWritable0(oldIndex: Int, newIndex: Int, xor: Int): Unit = { // goto block start pos
+ private[immutable] final def gotoFreshPosWritable0(oldIndex: Int, newIndex: Int, xor: Int): Unit = { // goto block start pos
if (xor < (1 << 5)) { // level = 0
//println("XXX clean with low xor")
} else
@@ -1103,7 +1146,7 @@ private[immutable] trait VectorPointer[T] {
// requires structure is dirty and at pos oldIndex,
// ensures structure is dirty and at pos newIndex and writable at level 0
- final def gotoFreshPosWritable1(oldIndex: Int, newIndex: Int, xor: Int): Unit = {
+ private[immutable] final def gotoFreshPosWritable1(oldIndex: Int, newIndex: Int, xor: Int): Unit = {
stabilize(oldIndex)
gotoFreshPosWritable0(oldIndex, newIndex, xor)
}
@@ -1113,7 +1156,7 @@ private[immutable] trait VectorPointer[T] {
// DEBUG STUFF
- def debug(): Unit = {
+ private[immutable] def debug(): Unit = {
return
/*
//println("DISPLAY 5: " + display5 + " ---> " + (if (display5 ne null) display5.map(x=> if (x eq null) "." else x + "->" +x.asInstanceOf[Array[AnyRef]].mkString("")).mkString(" ") else "null"))
diff --git a/src/library/scala/collection/interfaces/MapMethods.scala b/src/library/scala/collection/interfaces/MapMethods.scala
index dbe05906b1..fd6e7ad2a7 100644
--- a/src/library/scala/collection/interfaces/MapMethods.scala
+++ b/src/library/scala/collection/interfaces/MapMethods.scala
@@ -30,15 +30,15 @@ with SubtractableMethods[A, This]
def apply(key: A): B
def contains(key: A): Boolean
def isDefinedAt(key: A): Boolean
- def keySet: Set[A]
+ def keys: Iterable[A]
def keysIterator: Iterator[A]
- def valuesIterable: Iterable[B]
+ def keySet: Set[A]
+ def values: Iterable[B]
def valuesIterator: Iterator[B]
def default(key: A): B
def filterKeys(p: A => Boolean): DefaultMap[A, B]
def mapValues[C](f: B => C): DefaultMap[A, C]
def updated [B1 >: B](key: A, value: B1): Map[A, B1]
def + [B1 >: B] (elem1: (A, B1), elem2: (A, B1), elems: (A, B1) *): Map[A, B1]
- def ++[B1 >: B](elems: Traversable[(A, B1)]): Map[A, B1]
- def ++[B1 >: B] (iter: Iterator[(A, B1)]): Map[A, B1]
+ def ++[B1 >: B](xs: TraversableOnce[(A, B1)]): Map[A, B1]
}
diff --git a/src/library/scala/collection/interfaces/SeqMethods.scala b/src/library/scala/collection/interfaces/SeqMethods.scala
index df0307174d..401c5e6c55 100644
--- a/src/library/scala/collection/interfaces/SeqMethods.scala
+++ b/src/library/scala/collection/interfaces/SeqMethods.scala
@@ -44,7 +44,7 @@ trait SeqMethods[+A, +This <: SeqLike[A, This] with Seq[A]] extends IterableMeth
def padTo[B >: A, That](len: Int, elem: B)(implicit bf: CanBuildFrom[This, B, That]): That
def patch[B >: A, That](from: Int, patch: Seq[B], replaced: Int)(implicit bf: CanBuildFrom[This, B, That]): That
def prefixLength(p: A => Boolean): Int
- def removeDuplicates: This
+ def distinct: This
def reverse: This
def reverseIterator: Iterator[A]
def segmentLength(p: A => Boolean, from: Int): Int
diff --git a/src/library/scala/collection/interfaces/SetMethods.scala b/src/library/scala/collection/interfaces/SetMethods.scala
index 8a3142b44a..453143b790 100644
--- a/src/library/scala/collection/interfaces/SetMethods.scala
+++ b/src/library/scala/collection/interfaces/SetMethods.scala
@@ -21,8 +21,7 @@ trait AddableMethods[A, +This <: Addable[A, This]] {
protected def repr: This
def +(elem: A): This
def + (elem1: A, elem2: A, elems: A*): This
- def ++ (elems: Traversable[A]): This
- def ++ (iter: Iterator[A]): This
+ def ++ (xs: TraversableOnce[A]): This
}
/**
@@ -32,8 +31,7 @@ trait SubtractableMethods[A, +This <: Subtractable[A, This]] {
protected def repr: This
def -(elem: A): This
def -(elem1: A, elem2: A, elems: A*): This
- def --(elems: Traversable[A]): This
- def --(iter: Iterator[A]): This
+ def --(xs: TraversableOnce[A]): This
}
/**
diff --git a/src/library/scala/collection/interfaces/TraversableMethods.scala b/src/library/scala/collection/interfaces/TraversableMethods.scala
index 08ade7586d..1fc2451ec0 100644
--- a/src/library/scala/collection/interfaces/TraversableMethods.scala
+++ b/src/library/scala/collection/interfaces/TraversableMethods.scala
@@ -24,11 +24,10 @@ trait TraversableMethods[+A, +This <: TraversableLike[A, This] with Traversable[
// maps/iteration
def flatMap[B, That](f: A => Traversable[B])(implicit bf: CanBuildFrom[This, B, That]): That
def map[B, That](f: A => B)(implicit bf: CanBuildFrom[This, B, That]): That
- def partialMap[B, That](pf: PartialFunction[A, B])(implicit bf: CanBuildFrom[This, B, That]): That
+ def collect[B, That](pf: PartialFunction[A, B])(implicit bf: CanBuildFrom[This, B, That]): That
// new collections
- def ++[B >: A, That](that: Iterator[B])(implicit bf: CanBuildFrom[This, B, That]): That
- def ++[B >: A, That](that: Traversable[B])(implicit bf: CanBuildFrom[This, B, That]): That
+ def ++[B >: A, That](xs: TraversableOnce[B])(implicit bf: CanBuildFrom[This, B, That]): That
def copyToArray[B >: A](xs: Array[B], start: Int): Unit
def copyToArray[B >: A](xs: Array[B], start: Int, len: Int): Unit
def copyToBuffer[B >: A](dest: Buffer[B]): Unit
diff --git a/src/library/scala/collection/interfaces/TraversableOnceMethods.scala b/src/library/scala/collection/interfaces/TraversableOnceMethods.scala
new file mode 100644
index 0000000000..1e71215efd
--- /dev/null
+++ b/src/library/scala/collection/interfaces/TraversableOnceMethods.scala
@@ -0,0 +1,69 @@
+/* __ *\
+** ________ ___ / / ___ Scala API **
+** / __/ __// _ | / / / _ | (c) 2003-2010, LAMP/EPFL **
+** __\ \/ /__/ __ |/ /__/ __ | http://scala-lang.org/ **
+** /____/\___/_/ |_/____/_/ | | **
+** |/ **
+\* */
+
+package scala.collection
+package interfaces
+
+import mutable.Buffer
+
+trait TraversableOnceMethods[+A] {
+ self: TraversableOnce[A] =>
+
+ def foreach[U](f: A => U): Unit
+ protected[this] def reversed: TraversableOnce[A]
+
+ // tests
+ def isEmpty: Boolean
+ def nonEmpty: Boolean
+ def hasDefiniteSize: Boolean
+ def isTraversableAgain: Boolean
+
+ // applying a predicate
+ def forall(p: A => Boolean): Boolean
+ def exists(p: A => Boolean): Boolean
+ def find(p: A => Boolean): Option[A]
+ def count(p: A => Boolean): Int
+
+ // folds
+ def /:[B](z: B)(op: (B, A) => B): B
+ def :\[B](z: B)(op: (A, B) => B): B
+ def foldLeft[B](z: B)(op: (B, A) => B): B
+ def foldRight[B](z: B)(op: (A, B) => B): B
+ def reduceLeft[B >: A](op: (B, A) => B): B
+ def reduceRight[B >: A](op: (A, B) => B): B
+ def reduceLeftOption[B >: A](op: (B, A) => B): Option[B]
+ def reduceRightOption[B >: A](op: (A, B) => B): Option[B]
+
+ def sum[B >: A](implicit num: Numeric[B]): B
+ def product[B >: A](implicit num: Numeric[B]): B
+ def min[B >: A](implicit cmp: Ordering[B]): A
+ def max[B >: A](implicit cmp: Ordering[B]): A
+
+ // copies and conversions
+ def copyToBuffer[B >: A](dest: Buffer[B]): Unit
+ def copyToArray[B >: A](xs: Array[B], start: Int, len: Int): Unit
+ def copyToArray[B >: A](xs: Array[B], start: Int): Unit
+ def copyToArray[B >: A](xs: Array[B]): Unit
+
+ def toArray[B >: A : ClassManifest]: Array[B]
+ def toIterable: Iterable[A]
+ def toIterator: Iterator[A]
+ def toList: List[A]
+ def toMap[T, U](implicit ev: A <:< (T, U)): immutable.Map[T, U]
+ def toSet[B >: A]: immutable.Set[B]
+ def toStream: Stream[A]
+ def toTraversable: Traversable[A]
+
+ def mkString(start: String, sep: String, end: String): String
+ def mkString(sep: String): String
+ def mkString: String
+
+ def addString(buf: StringBuilder, start: String, sep: String, end: String): StringBuilder
+ def addString(buf: StringBuilder, sep: String): StringBuilder
+ def addString(buf: StringBuilder): StringBuilder
+}
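The fold and reduce families enumerated in the new interface behave as follows on a concrete collection (`/:` and `:\` are symbolic aliases for `foldLeft` and `foldRight`, and the `Option` variants avoid the exception on empty input):

```scala
val xs = List(1, 2, 3, 4)

val sumL = xs.foldLeft(0)(_ + _)   // ((((0+1)+2)+3)+4
val sumR = xs.foldRight(0)(_ + _)  // 1+(2+(3+(4+0)))
val prod = xs.reduceLeft(_ * _)    // no seed; requires a non-empty collection
val none = List.empty[Int].reduceLeftOption(_ + _)
```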
diff --git a/src/library/scala/collection/mutable/AddingBuilder.scala b/src/library/scala/collection/mutable/AddingBuilder.scala
index 06822e859b..d16a4a71f3 100644
--- a/src/library/scala/collection/mutable/AddingBuilder.scala
+++ b/src/library/scala/collection/mutable/AddingBuilder.scala
@@ -24,7 +24,7 @@ import generic._
* @version 2.8
* @since 2.8
*/
-class AddingBuilder[Elem, To <: Addable[Elem, To] with scala.collection.Iterable[Elem] with scala.collection.IterableLike[Elem, To]](empty: To)
+class AddingBuilder[Elem, To <: Addable[Elem, To] with collection.Iterable[Elem] with collection.IterableLike[Elem, To]](empty: To)
extends Builder[Elem, To] {
protected var elems: To = empty
def +=(x: Elem): this.type = { elems = elems + x; this }
diff --git a/src/library/scala/collection/mutable/ArrayBuffer.scala b/src/library/scala/collection/mutable/ArrayBuffer.scala
index c88b9d3374..0c6aa9ce0c 100644
--- a/src/library/scala/collection/mutable/ArrayBuffer.scala
+++ b/src/library/scala/collection/mutable/ArrayBuffer.scala
@@ -29,7 +29,7 @@ class ArrayBuffer[A](override protected val initialSize: Int)
extends Buffer[A]
with GenericTraversableTemplate[A, ArrayBuffer]
with BufferLike[A, ArrayBuffer[A]]
- with IndexedSeqLike[A, ArrayBuffer[A]]
+ with IndexedSeqOptimized[A, ArrayBuffer[A]]
with Builder[A, ArrayBuffer[A]]
with ResizableArray[A] {
@@ -43,7 +43,7 @@ class ArrayBuffer[A](override protected val initialSize: Int)
override def sizeHint(len: Int) {
if (len > size && len >= 1) {
- val newarray = new Array[AnyRef](len min 1)
+ val newarray = new Array[AnyRef](len)
Array.copy(array, 0, newarray, 0, size0)
array = newarray
}
@@ -65,10 +65,10 @@ class ArrayBuffer[A](override protected val initialSize: Int)
* via its <code>iterator</code> method. The identity of the
* buffer is returned.
*
- * @param iter the iterfable object.
+ * @param iter the iterable object.
* @return the updated buffer.
*/
- override def ++=(iter: Traversable[A]): this.type = iter match {
+ override def ++=(xs: TraversableOnce[A]): this.type = xs match {
case v: IndexedSeq[_] =>
val n = v.length
ensureSize(size0 + n)
@@ -76,7 +76,7 @@ class ArrayBuffer[A](override protected val initialSize: Int)
size0 += n
this
case _ =>
- super.++=(iter)
+ super.++=(xs)
}
/** Prepends a single element to this buffer and return
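The `++=` above keeps a bulk-copy fast path for `IndexedSeq` arguments and falls back to the generic per-element append for everything else; after the unification, iterators go through the same single method:

```scala
import scala.collection.mutable.ArrayBuffer

val buf = ArrayBuffer(1)
buf ++= Vector(2, 3)   // indexed fast path: one ensureSize + bulk copy
buf ++= Iterator(4, 5) // generic path: element-by-element append
```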
@@ -101,7 +101,7 @@ class ArrayBuffer[A](override protected val initialSize: Int)
* @param iter the iterable object.
* @return the updated buffer.
*/
- override def ++=:(iter: Traversable[A]): this.type = { insertAll(0, iter); this }
+ override def ++=:(xs: TraversableOnce[A]): this.type = { insertAll(0, xs.toTraversable); this }
/** Inserts new elements at the index <code>n</code>. Opposed to method
* <code>update</code>, this method will not replace an element with a
@@ -125,7 +125,7 @@ class ArrayBuffer[A](override protected val initialSize: Int)
* the buffer size.
*
* @param n the index which refers to the first element to delete.
- * @param count the number of elemenets to delete
+ * @param count the number of elements to delete
* @throws Predef.IndexOutOfBoundsException if <code>n</code> is out of bounds.
*/
override def remove(n: Int, count: Int) {
diff --git a/src/library/scala/collection/mutable/ArrayBuilder.scala b/src/library/scala/collection/mutable/ArrayBuilder.scala
index c824cc15f5..7e4bab353b 100644
--- a/src/library/scala/collection/mutable/ArrayBuilder.scala
+++ b/src/library/scala/collection/mutable/ArrayBuilder.scala
@@ -67,7 +67,7 @@ object ArrayBuilder {
this
}
- override def ++=(xs: scala.collection.Traversable[T]): this.type = (xs: AnyRef) match {
+ override def ++=(xs: TraversableOnce[T]): this.type = (xs: AnyRef) match {
case xs: WrappedArray.ofRef[_] =>
ensureSize(this.size + xs.length)
Array.copy(xs.array, 0, elems, this.size, xs.length)
@@ -131,7 +131,7 @@ object ArrayBuilder {
this
}
- override def ++=(xs: scala.collection.Traversable[Byte]): this.type = xs match {
+ override def ++=(xs: TraversableOnce[Byte]): this.type = xs match {
case xs: WrappedArray.ofByte =>
ensureSize(this.size + xs.length)
Array.copy(xs.array, 0, elems, this.size, xs.length)
@@ -195,7 +195,7 @@ object ArrayBuilder {
this
}
- override def ++=(xs: scala.collection.Traversable[Short]): this.type = xs match {
+ override def ++=(xs: TraversableOnce[Short]): this.type = xs match {
case xs: WrappedArray.ofShort =>
ensureSize(this.size + xs.length)
Array.copy(xs.array, 0, elems, this.size, xs.length)
@@ -259,7 +259,7 @@ object ArrayBuilder {
this
}
- override def ++=(xs: scala.collection.Traversable[Char]): this.type = xs match {
+ override def ++=(xs: TraversableOnce[Char]): this.type = xs match {
case xs: WrappedArray.ofChar =>
ensureSize(this.size + xs.length)
Array.copy(xs.array, 0, elems, this.size, xs.length)
@@ -323,7 +323,7 @@ object ArrayBuilder {
this
}
- override def ++=(xs: scala.collection.Traversable[Int]): this.type = xs match {
+ override def ++=(xs: TraversableOnce[Int]): this.type = xs match {
case xs: WrappedArray.ofInt =>
ensureSize(this.size + xs.length)
Array.copy(xs.array, 0, elems, this.size, xs.length)
@@ -387,7 +387,7 @@ object ArrayBuilder {
this
}
- override def ++=(xs: scala.collection.Traversable[Long]): this.type = xs match {
+ override def ++=(xs: TraversableOnce[Long]): this.type = xs match {
case xs: WrappedArray.ofLong =>
ensureSize(this.size + xs.length)
Array.copy(xs.array, 0, elems, this.size, xs.length)
@@ -451,7 +451,7 @@ object ArrayBuilder {
this
}
- override def ++=(xs: scala.collection.Traversable[Float]): this.type = xs match {
+ override def ++=(xs: TraversableOnce[Float]): this.type = xs match {
case xs: WrappedArray.ofFloat =>
ensureSize(this.size + xs.length)
Array.copy(xs.array, 0, elems, this.size, xs.length)
@@ -515,7 +515,7 @@ object ArrayBuilder {
this
}
- override def ++=(xs: scala.collection.Traversable[Double]): this.type = xs match {
+ override def ++=(xs: TraversableOnce[Double]): this.type = xs match {
case xs: WrappedArray.ofDouble =>
ensureSize(this.size + xs.length)
Array.copy(xs.array, 0, elems, this.size, xs.length)
@@ -579,7 +579,7 @@ object ArrayBuilder {
this
}
- override def ++=(xs: scala.collection.Traversable[Boolean]): this.type = xs match {
+ override def ++=(xs: TraversableOnce[Boolean]): this.type = xs match {
case xs: WrappedArray.ofBoolean =>
ensureSize(this.size + xs.length)
Array.copy(xs.array, 0, elems, this.size, xs.length)
@@ -643,7 +643,7 @@ object ArrayBuilder {
this
}
- override def ++=(xs: scala.collection.Traversable[Unit]): this.type = xs match {
+ override def ++=(xs: TraversableOnce[Unit]): this.type = xs match {
case xs: WrappedArray.ofUnit =>
ensureSize(this.size + xs.length)
Array.copy(xs.array, 0, elems, this.size, xs.length)
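
The hunks above widen every primitive `++=` overload of `ArrayBuilder` from `Traversable[T]` to `TraversableOnce[T]`. A minimal sketch of what this buys, assuming the 2.8-era collections API: the builder can now consume an `Iterator` through the same method that handles collections.

```scala
import scala.collection.mutable.ArrayBuilder

val b = ArrayBuilder.make[Int]()
b ++= List(1, 2, 3)   // a Traversable, accepted before this change
b ++= Iterator(4, 5)  // an Iterator, newly accepted via TraversableOnce
val arr: Array[Int] = b.result
```

The specialized `WrappedArray` cases in each hunk still fire when the argument is a wrapped primitive array, so the widening costs nothing on the fast path.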
diff --git a/src/library/scala/collection/mutable/ArrayLike.scala b/src/library/scala/collection/mutable/ArrayLike.scala
index c26f333afb..0b64f1255e 100644
--- a/src/library/scala/collection/mutable/ArrayLike.scala
+++ b/src/library/scala/collection/mutable/ArrayLike.scala
@@ -18,7 +18,7 @@ import generic._
*
* @since 2.8
*/
-trait ArrayLike[A, +Repr] extends IndexedSeqLike[A, Repr] { self =>
+trait ArrayLike[A, +Repr] extends IndexedSeqOptimized[A, Repr] { self =>
/** Creates a possibly nested IndexedSeq which consists of all the elements
* of this array. If the elements are arrays themselves, the `deep' transformation
diff --git a/src/library/scala/collection/mutable/ArrayOps.scala b/src/library/scala/collection/mutable/ArrayOps.scala
index 61fcc77e14..553461c805 100644
--- a/src/library/scala/collection/mutable/ArrayOps.scala
+++ b/src/library/scala/collection/mutable/ArrayOps.scala
@@ -24,6 +24,12 @@ abstract class ArrayOps[T] extends ArrayLike[T, Array[T]] {
ClassManifest.fromClass(
repr.getClass.getComponentType.getComponentType.asInstanceOf[Predef.Class[U]]))
+ override def toArray[U >: T : ClassManifest]: Array[U] =
+ if (implicitly[ClassManifest[U]].erasure eq repr.getClass.getComponentType)
+ repr.asInstanceOf[Array[U]]
+ else
+ super.toArray[U]
+
/** Flattens a two-dimensional array by concatenating all its rows
* into a single array
*/
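
The new `toArray` override above is a copy-avoidance shortcut: when the erasure of the requested element type equals the array's actual component type, the receiver itself is returned instead of a fresh copy. A hedged sketch of the observable difference:

```scala
val xs = Array(1, 2, 3)
val same = xs.toArray      // component types match: may return xs itself, no copy
val anys = xs.toArray[Any] // erasure differs: falls back to super.toArray and copies
```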
diff --git a/src/library/scala/collection/mutable/GenericArray.scala b/src/library/scala/collection/mutable/ArraySeq.scala
index 4aecf48585..f6f958601d 100644
--- a/src/library/scala/collection/mutable/GenericArray.scala
+++ b/src/library/scala/collection/mutable/ArraySeq.scala
@@ -22,12 +22,12 @@ import generic._
* @version 2.8
* @since 2.8
*/
-class GenericArray[A](override val length: Int)
+class ArraySeq[A](override val length: Int)
extends IndexedSeq[A]
- with GenericTraversableTemplate[A, GenericArray]
- with IndexedSeqLike[A, GenericArray[A]] {
+ with GenericTraversableTemplate[A, ArraySeq]
+ with IndexedSeqOptimized[A, ArraySeq[A]] {
- override def companion: GenericCompanion[GenericArray] = GenericArray
+ override def companion: GenericCompanion[ArraySeq] = ArraySeq
val array: Array[AnyRef] = new Array[AnyRef](length)
@@ -64,11 +64,11 @@ extends IndexedSeq[A]
}
}
-object GenericArray extends SeqFactory[GenericArray] {
- implicit def canBuildFrom[A]: CanBuildFrom[Coll, A, GenericArray[A]] = new GenericCanBuildFrom[A]
- def newBuilder[A]: Builder[A, GenericArray[A]] =
+object ArraySeq extends SeqFactory[ArraySeq] {
+ implicit def canBuildFrom[A]: CanBuildFrom[Coll, A, ArraySeq[A]] = new GenericCanBuildFrom[A]
+ def newBuilder[A]: Builder[A, ArraySeq[A]] =
new ArrayBuffer[A] mapResult { buf =>
- val result = new GenericArray[A](buf.length)
+ val result = new ArraySeq[A](buf.length)
buf.copyToArray(result.array.asInstanceOf[Array[Any]], 0)
result
}
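
The rename from `GenericArray` to `ArraySeq` is purely nominal; the companion factory works as before. A small usage sketch against the 2.8-era API:

```scala
import scala.collection.mutable.ArraySeq

val a = ArraySeq(1, 2, 3)  // formerly GenericArray(1, 2, 3)
a(0) = 10                  // fixed length, but elements are mutable in place
```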
diff --git a/src/library/scala/collection/mutable/ArrayStack.scala b/src/library/scala/collection/mutable/ArrayStack.scala
index 1c3bdacaa5..8f9d1bfc88 100644
--- a/src/library/scala/collection/mutable/ArrayStack.scala
+++ b/src/library/scala/collection/mutable/ArrayStack.scala
@@ -104,15 +104,7 @@ class ArrayStack[T] private(private var table : Array[AnyRef],
*
* @param x The source of elements to push
*/
- def ++=(x: scala.collection.Iterable[T]): this.type = { x.foreach(this +=(_)); this }
-
-
- /**
- * Pushes all the provided elements onto the stack.
- *
- * @param x The source of elements to push
- */
- def ++=(x: Iterator[T]): this.type = { x.foreach(this +=(_)); this }
+ def ++=(xs: TraversableOnce[T]): this.type = { xs foreach += ; this }
/**
* Alias for push.
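
Collapsing the separate `Iterable` and `Iterator` overloads of `++=` into a single `TraversableOnce` method means both call sites below resolve to the same definition (2.8-era sketch):

```scala
import scala.collection.mutable.ArrayStack

val s = new ArrayStack[Int]
s ++= List(1, 2)      // previously the Iterable overload
s ++= Iterator(3, 4)  // previously the Iterator overload; now the same method
```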
diff --git a/src/library/scala/collection/mutable/BufferLike.scala b/src/library/scala/collection/mutable/BufferLike.scala
index 3516a60233..0fb34cc8e0 100644
--- a/src/library/scala/collection/mutable/BufferLike.scala
+++ b/src/library/scala/collection/mutable/BufferLike.scala
@@ -14,6 +14,7 @@ package mutable
import generic._
import script._
+import annotation.migration
/** A template trait for buffers of type `Buffer[A]`.
*
@@ -30,10 +31,6 @@ import script._
* @author Martin Odersky
* @author Matthias Zenger
* @version 2.8
- * @since 2.8
- * @author Matthias Zenger
- * @author Martin Odersky
- * @version 2.8
* @since 2.8
* @define buffernote @note
* This trait provides most of the operations of a `Buffer` independently of its representation.
@@ -63,12 +60,14 @@ trait BufferLike[A, +This <: BufferLike[A, This] with Buffer[A]]
extends Growable[A]
with Shrinkable[A]
with Scriptable[A]
- with Addable[A, This]
with Subtractable[A, This]
with Cloneable[This]
with SeqLike[A, This]
{ self : This =>
+ // Note this does not extend Addable because `+` is being phased out of
+ // all Seq-derived classes.
+
import scala.collection.{Iterable, Traversable}
// Abstract methods from IndexedSeq:
@@ -132,50 +131,33 @@ trait BufferLike[A, +This <: BufferLike[A, This] with Buffer[A]]
this
}
- /** Prepends the elements contained in a traversable collection
- * to this buffer.
- * @param elems the collection containing the elements to prepend.
- * @return the buffer itself.
- */
- def ++=:(elems: Traversable[A]): this.type = { insertAll(0, elems); this }
-
- /** Prepends the elements produced by an iterator to this buffer.
+ /** Prepends elements to this buffer.
*
- * @param iter the iterator producing the elements to prepend.
- * @return the buffer itself.
+ * @param xs the TraversableOnce containing the elements to prepend.
+ * @return the buffer itself.
*/
- def ++=:(iter: Iterator[A]): this.type = { insertAll(0, iter.toSeq); this }
+ def ++=:(xs: TraversableOnce[A]): this.type = { insertAll(0, xs.toTraversable); this }
/** Appends the given elements to this buffer.
*
* @param elems the elements to append.
*/
- def append(elems: A*) { this ++= elems }
+ def append(elems: A*) { appendAll(elems) }
/** Appends the elements contained in a traversable collection to this buffer.
* @param elems the collection containing the elements to append.
*/
- def appendAll(elems: Traversable[A]) { this ++= elems }
-
- /** Appends the elements produced by an iterator to this buffer.
- * @param elems the iterator producing the elements to append.
- */
- def appendAll(iter: Iterator[A]) { this ++= iter }
+ def appendAll(xs: TraversableOnce[A]) { this ++= xs }
/** Prepends given elements to this buffer.
* @param elems the elements to prepend.
*/
- def prepend(elems: A*) { elems ++=: this }
+ def prepend(elems: A*) { prependAll(elems) }
/** Prepends the elements contained in a traversable collection to this buffer.
* @param elems the collection containing the elements to prepend.
*/
- def prependAll(iter: Traversable[A]) { iter ++=: this }
-
- /** Prepends a number of elements produced by an iterator to this buffer.
- * @param iter the iterator producing the elements to prepend.
- */
- def prependAll(iter: Iterator[A]) { iter ++=: this }
+ def prependAll(xs: TraversableOnce[A]) { xs ++=: this }
/** Inserts new elements at a given index into this buffer.
*
@@ -230,7 +212,7 @@ trait BufferLike[A, +This <: BufferLike[A, This] with Buffer[A]]
*/
override def stringPrefix: String = "Buffer"
- /** Provide a read-only view of this byffer as a sequence
+ /** Provide a read-only view of this buffer as a sequence
* @return A sequence which refers to this buffer for all its operations.
*/
def readOnly: scala.collection.Seq[A] = toSeq
@@ -254,25 +236,35 @@ trait BufferLike[A, +This <: BufferLike[A, This] with Buffer[A]]
@deprecated("use ++=: instead")
final def ++:(iter: Traversable[A]): This = ++=:(iter)
+ @deprecated("use `+=:' instead")
+ final def +:(elem: A): This = +=:(elem)
+
/** Adds a single element to this collection and returns
- * the collection itself.
+ * the collection itself. Note that for backward compatibility
+ * reasons, this method mutates the collection in place, unlike
+ * similar but undeprecated methods throughout the collections
+ * hierarchy. You are strongly recommended to use '+=' instead.
*
* @param elem the element to add.
*/
@deprecated("Use += instead if you intend to add by side effect to an existing collection.\n"+
- "Use `clone() ++=' if you intend to create a new collection.")
- override def + (elem: A): This = { +=(elem); repr }
+ "Use `clone() +=' if you intend to create a new collection.")
+ def + (elem: A): This = { +=(elem); repr }
/** Adds two or more elements to this collection and returns
- * the collection itself.
+ * the collection itself. Note that for backward compatibility
+ * reasons, this method mutates the collection in place, unlike
+ * similar but undeprecated methods throughout the collections
+ * hierarchy. You are strongly recommended to use '++=' instead.
*
* @param elem1 the first element to add.
* @param elem2 the second element to add.
* @param elems the remaining elements to add.
*/
- @deprecated("Use += instead if you intend to add by side effect to an existing collection.\n"+
+ @deprecated("Use ++= instead if you intend to add by side effect to an existing collection.\n"+
"Use `clone() ++=' if you intend to create a new collection.")
- override def + (elem1: A, elem2: A, elems: A*): This = {
+ def + (elem1: A, elem2: A, elems: A*): This = {
this += elem1 += elem2 ++= elems
repr
}
@@ -282,33 +274,22 @@ trait BufferLike[A, +This <: BufferLike[A, This] with Buffer[A]]
*
* @param iter the iterable object.
*/
- @deprecated("Use ++= instead if you intend to add by side effect to an existing collection.\n"+
- "Use `clone() ++=` if you intend to create a new collection.")
- override def ++(iter: Traversable[A]): This = {
- for (elem <- iter) +=(elem)
- repr
- }
-
- /** Adds a number of elements provided by an iterator and returns
- * the collection itself.
- *
- * @param iter the iterator
- */
- @deprecated("Use ++= instead if you intend to add by side effect to an existing collection.\n"+
- "Use `clone() ++=` if you intend to create a new collection.")
- override def ++ (iter: Iterator[A]): This = {
- for (elem <- iter) +=(elem)
- repr
- }
+ @migration(2, 8,
+ "As of 2.8, ++ always creates a new collection, even on Buffers.\n"+
+ "Use ++= instead if you intend to add by side effect to an existing collection.\n"
+ )
+ def ++(xs: TraversableOnce[A]): This = clone() ++= xs
/** Removes a single element from this collection and returns
* the collection itself.
*
* @param elem the element to remove.
*/
- @deprecated("Use -= instead if you intend to remove by side effect from an existing collection.\n"+
- "Use `clone() -=` if you intend to create a new collection.")
- override def -(elem: A): This = { -=(elem); repr }
+ @migration(2, 8,
+ "As of 2.8, - always creates a new collection, even on Buffers.\n"+
+ "Use -= instead if you intend to remove by side effect from an existing collection.\n"
+ )
+ override def -(elem: A): This = clone() -= elem
/** Removes two or more elements from this collection and returns
* the collection itself.
@@ -317,40 +298,20 @@ trait BufferLike[A, +This <: BufferLike[A, This] with Buffer[A]]
* @param elem2 the second element to remove.
* @param elems the remaining elements to remove.
*/
- @deprecated("Use -= instead if you intend to remove by side effect from an existing collection.\n"+
- "Use `clone() -=` if you intend to create a new collection.")
- override def -(elem1: A, elem2: A, elems: A*): This = {
- this -= elem1 -= elem2 --= elems
- repr
- }
+ @migration(2, 8,
+ "As of 2.8, - always creates a new collection, even on Buffers.\n"+
+ "Use -= instead if you intend to remove by side effect from an existing collection.\n"
+ )
+ override def -(elem1: A, elem2: A, elems: A*): This = clone() -= elem1 -= elem2 --= elems
/** Removes a number of elements provided by a Traversable object and returns
* the collection itself.
*
* @param iter the Traversable object.
*/
- @deprecated("Use --= instead if you intend to remove by side effect from an existing collection.\n"+
- "Use `clone() --=` if you intend to create a new collection.")
- override def --(iter: Traversable[A]): This = {
- for (elem <- iter) -=(elem)
- repr
- }
-
- @deprecated("use `+=:' instead")
- final def +:(elem: A): This = +=:(elem)
-
- /** Removes a number of elements provided by an iterator and returns
- * the collection itself.
- *
- * @param iter the iterator
- */
- @deprecated("Use --= instead if you intend to remove by side effect from an existing collection.\n"+
- "Use `clone() --=` if you intend to create a new collection.")
- override def --(iter: Iterator[A]): This = {
- for (elem <- iter) -=(elem)
- repr
- }
+ @migration(2, 8,
+ "As of 2.8, -- always creates a new collection, even on Buffers.\n"+
+ "Use --= instead if you intend to remove by side effect from an existing collection.\n"
+ )
+ override def --(xs: TraversableOnce[A]): This = clone() --= xs
}
-
-
-
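
The `@migration` annotations above record a behavioral break: on buffers, `++`, `-` and `--` used to mutate the receiver, but as of 2.8 they operate on a `clone()` and leave the original untouched. A sketch of the distinction, using `ListBuffer` as an illustrative Buffer:

```scala
import scala.collection.mutable.ListBuffer

val buf = ListBuffer(1, 2, 3)
val grown = buf ++ List(4, 5)  // 2.8: builds a NEW buffer; buf is untouched
buf ++= List(4, 5)             // mutates buf in place, as before
```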
diff --git a/src/library/scala/collection/mutable/BufferProxy.scala b/src/library/scala/collection/mutable/BufferProxy.scala
index cc8aba79ab..d4444dab67 100644
--- a/src/library/scala/collection/mutable/BufferProxy.scala
+++ b/src/library/scala/collection/mutable/BufferProxy.scala
@@ -61,14 +61,14 @@ trait BufferProxy[A] extends Buffer[A] with Proxy {
*/
@deprecated("Use ++= instead if you intend to add by side effect to an existing collection.\n"+
"Use `clone() ++=` if you intend to create a new collection.")
- def ++(iter: scala.collection.Iterable[A]): Buffer[A] = self.++(iter)
+ override def ++(xs: TraversableOnce[A]): Buffer[A] = self.++(xs)
/** Appends a number of elements provided by an iterable object
* via its <code>iterator</code> method.
*
* @param iter the iterable object.
*/
- def ++=(iter: scala.collection.Iterable[A]): this.type = { self.++=(iter); this }
+ override def ++=(xs: TraversableOnce[A]): this.type = { self.++=(xs); this }
/** Appends a sequence of elements to this buffer.
*
@@ -81,7 +81,7 @@ trait BufferProxy[A] extends Buffer[A] with Proxy {
*
* @param iter the iterable object.
*/
- def appendAll(iter: scala.collection.Iterable[A]) { self.appendAll(iter) }
+ override def appendAll(xs: TraversableOnce[A]) { self.appendAll(xs) }
/** Prepend a single element to this buffer and return
* the identity of the buffer.
@@ -90,8 +90,7 @@ trait BufferProxy[A] extends Buffer[A] with Proxy {
*/
def +=:(elem: A): this.type = { self.+=:(elem); this }
- override def ++=:(iter: scala.collection.Traversable[A]): this.type = { self.++=:(iter); this }
- override def ++=:(iter: scala.collection.Iterator[A]): this.type = { self.++=:(iter); this }
+ override def ++=:(xs: TraversableOnce[A]): this.type = { self.++=:(xs); this }
/** Prepend an element to this list.
*
@@ -105,7 +104,7 @@ trait BufferProxy[A] extends Buffer[A] with Proxy {
*
* @param iter the iterable object.
*/
- def prependAll(elems: scala.collection.Iterable[A]) { self.prependAll(elems) }
+ override def prependAll(xs: TraversableOnce[A]) { self.prependAll(xs) }
/** Inserts new elements at the index <code>n</code>. As opposed to method
* <code>update</code>, this method will not replace an element with a
diff --git a/src/library/scala/collection/mutable/Builder.scala b/src/library/scala/collection/mutable/Builder.scala
index c7932ae344..2e6a9149bc 100644
--- a/src/library/scala/collection/mutable/Builder.scala
+++ b/src/library/scala/collection/mutable/Builder.scala
@@ -48,7 +48,7 @@ trait Builder[-Elem, +To] extends Growable[Elem] {
* builder implementations are still required to work correctly even if the hint is
* wrong, i.e. a different number of elements is added.
*
- * @size the hint how many elements will be added.
+ * @param size the hint how many elements will be added.
*/
def sizeHint(size: Int) {}
@@ -64,8 +64,7 @@ trait Builder[-Elem, +To] extends Growable[Elem] {
val self = Builder.this
def +=(x: Elem): this.type = { self += x; this }
def clear() = self.clear()
- override def ++=(xs: Iterator[Elem]): this.type = { self ++= xs; this }
- override def ++=(xs:scala.collection.Traversable[Elem]): this.type = { self ++= xs; this }
+ override def ++=(xs: TraversableOnce[Elem]): this.type = { self ++= xs; this }
def result: NewTo = f(self.result)
}
}
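
`mapResult` produces a proxy builder that forwards additions to the original builder and post-processes `result`; the change above lets the proxy forward through a single `TraversableOnce`-typed `++=` instead of two overloads. A hedged sketch, assuming `ArrayBuffer` (which is itself a `Builder` in 2.8):

```scala
import scala.collection.mutable.ArrayBuffer

val b = (new ArrayBuffer[Int]) mapResult (_.toList)
b ++= Iterator(1, 2, 3)      // forwarded to the underlying ArrayBuffer
val xs: List[Int] = b.result // the buffer's result, mapped through toList
```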
diff --git a/src/library/scala/collection/mutable/ConcurrentMap.scala b/src/library/scala/collection/mutable/ConcurrentMap.scala
index d09bf57e1b..2cfa4f8ae2 100644
--- a/src/library/scala/collection/mutable/ConcurrentMap.scala
+++ b/src/library/scala/collection/mutable/ConcurrentMap.scala
@@ -1,9 +1,5 @@
-package scala.collection.mutable
-
-
-
-
-
+package scala.collection
+package mutable
/**
* A template trait for mutable maps that allow concurrent access.
@@ -66,14 +62,4 @@ trait ConcurrentMap[A, B] extends Map[A, B] {
* @return `Some(v)` if the given key was previously mapped to some value `v`, or `None` otherwise
*/
def replace(k: A, v: B): Option[B]
-
}
-
-
-
-
-
-
-
-
-
diff --git a/src/library/scala/collection/mutable/DoubleLinkedList.scala b/src/library/scala/collection/mutable/DoubleLinkedList.scala
index 8c50f739e1..718d6aa35d 100644
--- a/src/library/scala/collection/mutable/DoubleLinkedList.scala
+++ b/src/library/scala/collection/mutable/DoubleLinkedList.scala
@@ -40,7 +40,7 @@ class DoubleLinkedList[A]() extends LinearSeq[A]
}
object DoubleLinkedList extends SeqFactory[DoubleLinkedList] {
- implicit def canBuildFrom[A]: CanBuildFrom[Coll, A, DoubleLinkedList[A]] = new GenericCanBuildFrom[A] //new CanBuildFrom[Coll, A, DoubleLinkedList[A]] { : Coll) = from.traversableBuilder[A] }
+ implicit def canBuildFrom[A]: CanBuildFrom[Coll, A, DoubleLinkedList[A]] = new GenericCanBuildFrom[A]
def newBuilder[A]: Builder[A, DoubleLinkedList[A]] =
new Builder[A, DoubleLinkedList[A]] {
var current: DoubleLinkedList[A] = _
diff --git a/src/library/scala/collection/mutable/FlatHashTable.scala b/src/library/scala/collection/mutable/FlatHashTable.scala
index 0e73bf7fad..ea4033d405 100644
--- a/src/library/scala/collection/mutable/FlatHashTable.scala
+++ b/src/library/scala/collection/mutable/FlatHashTable.scala
@@ -11,7 +11,7 @@
package scala.collection
package mutable
-/**
+/** An implementation class backing a HashSet.
* @since 2.3
*/
trait FlatHashTable[A] {
@@ -46,7 +46,7 @@ trait FlatHashTable[A] {
private def initialCapacity = capacity(initialSize)
/**
- * Initialises the collection from the input stream. `f` will be called for each element
+ * Initializes the collection from the input stream. `f` will be called for each element
* read from the input stream in the order determined by the stream. This is useful for
* structures where iteration order is important (e.g. LinkedHashSet).
*
diff --git a/src/library/scala/collection/mutable/GrowingBuilder.scala b/src/library/scala/collection/mutable/GrowingBuilder.scala
new file mode 100644
index 0000000000..445e9d4f3e
--- /dev/null
+++ b/src/library/scala/collection/mutable/GrowingBuilder.scala
@@ -0,0 +1,30 @@
+/* __ *\
+** ________ ___ / / ___ Scala API **
+** / __/ __// _ | / / / _ | (c) 2003-2010, LAMP/EPFL **
+** __\ \/ /__/ __ |/ /__/ __ | http://scala-lang.org/ **
+** /____/\___/_/ |_/____/_/ | | **
+** |/ **
+\* */
+
+package scala.collection
+package mutable
+
+import generic._
+
+/** The canonical builder for collections that are growable, i.e. that support an
+ * efficient `+=` method which adds an element to the collection. It is
+ * almost identical to AddingBuilder, but necessitated by the existence of
+ * classes which are Growable but not Addable, which is a result of covariance
+ * interacting surprisingly with any2stringadd thus driving '+' out of the Seq
+ * hierarchy. The tendrils of original sin should never be underestimated.
+ *
+ * @author Paul Phillips
+ * @version 2.8
+ * @since 2.8
+ */
+class GrowingBuilder[Elem, To <: Growable[Elem]](empty: To) extends Builder[Elem, To] {
+ protected var elems: To = empty
+ def +=(x: Elem): this.type = { elems += x; this }
+ def clear() { elems = empty }
+ def result: To = elems
+}
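
`GrowingBuilder` differs from `AddingBuilder` in that it grows the `Growable` it was given via `+=` rather than rebuilding with `+`. A usage sketch:

```scala
import scala.collection.mutable.{GrowingBuilder, ListBuffer}

val b = new GrowingBuilder(new ListBuffer[Int])
b += 1
b += 2
val result: ListBuffer[Int] = b.result // the underlying buffer, grown in place
```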
diff --git a/src/library/scala/collection/mutable/HashMap.scala b/src/library/scala/collection/mutable/HashMap.scala
index 658c574087..2b5cad37d8 100644
--- a/src/library/scala/collection/mutable/HashMap.scala
+++ b/src/library/scala/collection/mutable/HashMap.scala
@@ -67,7 +67,7 @@ class HashMap[A, B] extends Map[A, B]
}
/* Override to avoid tuple allocation in foreach */
- override def valuesIterable: collection.Iterable[B] = new DefaultValuesIterable {
+ override def values: collection.Iterable[B] = new DefaultValuesIterable {
override def foreach[C](f: B => C) = foreachEntry(e => f(e.value))
}
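
The override renamed from `valuesIterable` to `values` tracks the 2.8 migration in which `values` returns an `Iterable` rather than an `Iterator`; its specialized `foreach` avoids allocating a tuple per entry. A sketch of the client-visible change:

```scala
import scala.collection.mutable.HashMap

val m = HashMap(1 -> "a", 2 -> "b")
val vs: Iterable[String] = m.values // as of 2.8 an Iterable, not an Iterator
vs foreach println                  // dispatches to the tuple-free override above
```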
diff --git a/src/library/scala/collection/mutable/HashSet.scala b/src/library/scala/collection/mutable/HashSet.scala
index f1f2ed3274..e985e717b0 100644
--- a/src/library/scala/collection/mutable/HashSet.scala
+++ b/src/library/scala/collection/mutable/HashSet.scala
@@ -50,7 +50,7 @@ class HashSet[A] extends Set[A]
}
}
- override def clone(): Set[A] = new HashSet[A] ++= this
+ override def clone() = new HashSet[A] ++= this
private def writeObject(s: java.io.ObjectOutputStream) {
serializeTo(s)
diff --git a/src/library/scala/collection/mutable/HashTable.scala b/src/library/scala/collection/mutable/HashTable.scala
index aa7993ed14..14f1720a4c 100644
--- a/src/library/scala/collection/mutable/HashTable.scala
+++ b/src/library/scala/collection/mutable/HashTable.scala
@@ -65,7 +65,7 @@ trait HashTable[A] {
private def initialCapacity = capacity(initialSize)
/**
- * Initialises the collection from the input stream. `f` will be called for each key/value pair
+ * Initializes the collection from the input stream. `f` will be called for each key/value pair
* read from the input stream in the order determined by the stream. This is useful for
* structures where iteration order is important (e.g. LinkedHashMap).
*/
diff --git a/src/library/scala/collection/mutable/ImmutableMapAdaptor.scala b/src/library/scala/collection/mutable/ImmutableMapAdaptor.scala
index d6b3115b81..fba28e7c2a 100644
--- a/src/library/scala/collection/mutable/ImmutableMapAdaptor.scala
+++ b/src/library/scala/collection/mutable/ImmutableMapAdaptor.scala
@@ -12,6 +12,7 @@
package scala.collection
package mutable
+import annotation.migration
/** This class can be used as an adaptor to create mutable maps from
* immutable map implementations. Only method <code>empty</code> has
@@ -41,19 +42,17 @@ extends Map[A, B]
override def isDefinedAt(key: A) = imap.isDefinedAt(key)
- override def keySet: scala.collection.Set[A] = imap.keySet
+ override def keySet: collection.Set[A] = imap.keySet
override def keysIterator: Iterator[A] = imap.keysIterator
- @deprecated("use `keysIterator' instead")
- override def keys: Iterator[A] = imap.keysIterator
-
- override def valuesIterable: scala.collection.Iterable[B] = imap.valuesIterable
+ @migration(2, 8, "As of 2.8, keys returns Iterable[A] rather than Iterator[A].")
+ override def keys: collection.Iterable[A] = imap.keys
override def valuesIterator: Iterator[B] = imap.valuesIterator
- @deprecated("use `valuesIterator' instead")
- override def values: Iterator[B] = imap.valuesIterator
+ @migration(2, 8, "As of 2.8, values returns Iterable[B] rather than Iterator[B].")
+ override def values: collection.Iterable[B] = imap.values
def iterator: Iterator[(A, B)] = imap.iterator
diff --git a/src/library/scala/collection/mutable/IndexedSeq.scala b/src/library/scala/collection/mutable/IndexedSeq.scala
index b11131e917..0a173395e0 100644
--- a/src/library/scala/collection/mutable/IndexedSeq.scala
+++ b/src/library/scala/collection/mutable/IndexedSeq.scala
@@ -16,6 +16,9 @@ import generic._
/** A subtrait of <code>collection.IndexedSeq</code> which represents sequences
* that can be mutated.
+ * $indexedSeqInfo
+ *
+ * @since 2.8
*/
trait IndexedSeq[A] extends Seq[A]
with scala.collection.IndexedSeq[A]
diff --git a/src/library/scala/collection/mutable/IndexedSeqOptimized.scala b/src/library/scala/collection/mutable/IndexedSeqOptimized.scala
new file mode 100755
index 0000000000..134cc2a8ea
--- /dev/null
+++ b/src/library/scala/collection/mutable/IndexedSeqOptimized.scala
@@ -0,0 +1,21 @@
+/* __ *\
+** ________ ___ / / ___ Scala API **
+** / __/ __// _ | / / / _ | (c) 2003-2010, LAMP/EPFL **
+** __\ \/ /__/ __ |/ /__/ __ | http://scala-lang.org/ **
+** /____/\___/_/ |_/____/_/ | | **
+** |/ **
+\* */
+
+// $Id: IndexedSeqLike.scala 20129 2009-12-14 17:12:17Z odersky $
+
+
+package scala.collection
+package mutable
+import generic._
+
+/** A subtrait of scala.collection.IndexedSeq which represents sequences
+ * that can be mutated.
+ *
+ * @since 2.8
+ */
+trait IndexedSeqOptimized[A, +Repr] extends IndexedSeqLike[A, Repr] with scala.collection.IndexedSeqOptimized[A, Repr]
diff --git a/src/library/scala/collection/mutable/IndexedSeqView.scala b/src/library/scala/collection/mutable/IndexedSeqView.scala
index e864845455..d870b762d3 100644
--- a/src/library/scala/collection/mutable/IndexedSeqView.scala
+++ b/src/library/scala/collection/mutable/IndexedSeqView.scala
@@ -25,7 +25,10 @@ import TraversableView.NoBuilder
* @version 2.8
* @since 2.8
*/
-trait IndexedSeqView[A, +Coll] extends scala.collection.IndexedSeqView[A, Coll] {
+trait IndexedSeqView[A, +Coll] extends IndexedSeq[A]
+ with IndexedSeqOptimized[A, IndexedSeqView[A, Coll]]
+ with scala.collection.SeqView[A, Coll]
+ with scala.collection.SeqViewLike[A, Coll, IndexedSeqView[A, Coll]] {
self =>
def update(idx: Int, elem: A)
@@ -88,9 +91,22 @@ self =>
override def reverse: IndexedSeqView[A, Coll] = newReversed.asInstanceOf[IndexedSeqView[A, Coll]]
}
-/*
- * object IndexedSeqView {
- type Coll = TraversableView[_, C] forSome { type C <: scala.collection.Traversable[_] }
- implicit def canBuildFrom[A]: CanBuildFrom[IndexedSeq[_], A, IndexedSeqView[A], Coll] = new CanBuildFrom[mutable.IndexedSeq[_], A, IndexedSeqView[A], Coll] { : Coll) = new NoBuilder }
+/** $factoryInfo
+ * Note that the canBuildFrom factories yield SeqViews, not IndexedSeqViews.
+ * This is intentional, because not all operations yield again a mutable.IndexedSeqView.
+ * For instance, map just gives a SeqView, which reflects the fact that
+ * map cannot do its work and maintain a pointer into the original indexed sequence.
+ */
+object IndexedSeqView {
+ type Coll = TraversableView[_, C] forSome {type C <: Traversable[_]}
+ implicit def canBuildFrom[A]: CanBuildFrom[Coll, A, SeqView[A, Seq[_]]] =
+ new CanBuildFrom[Coll, A, SeqView[A, Seq[_]]] {
+ def apply(from: Coll) = new NoBuilder
+ def apply() = new NoBuilder
+ }
+ implicit def arrCanBuildFrom[A]: CanBuildFrom[TraversableView[_, Array[_]], A, SeqView[A, Array[A]]] =
+ new CanBuildFrom[TraversableView[_, Array[_]], A, SeqView[A, Array[A]]] {
+ def apply(from: TraversableView[_, Array[_]]) = new NoBuilder
+ def apply() = new NoBuilder
+ }
}
-*/
diff --git a/src/library/scala/collection/mutable/LazyBuilder.scala b/src/library/scala/collection/mutable/LazyBuilder.scala
index 2ce1fa9827..7714d29f08 100644
--- a/src/library/scala/collection/mutable/LazyBuilder.scala
+++ b/src/library/scala/collection/mutable/LazyBuilder.scala
@@ -20,10 +20,9 @@ import immutable.{List, Nil}
*/
abstract class LazyBuilder[Elem, +To] extends Builder[Elem, To] {
/** The different segments of elements to be added to the builder, represented as iterators */
- protected var parts = new ListBuffer[scala.collection.Traversable[Elem]]
+ protected var parts = new ListBuffer[TraversableOnce[Elem]]
def +=(x: Elem): this.type = { parts += List(x); this }
- override def ++=(xs: Iterator[Elem]): this.type = { parts += xs.toStream; this }
- override def ++=(xs: scala.collection.Traversable[Elem]): this.type = { parts += xs; this }
+ override def ++=(xs: TraversableOnce[Elem]): this.type = { parts += xs ; this }
def result(): To
def clear() { parts.clear() }
}
diff --git a/src/library/scala/collection/mutable/LinearSeq.scala b/src/library/scala/collection/mutable/LinearSeq.scala
index 25d7ef6be8..9abaef5aff 100644
--- a/src/library/scala/collection/mutable/LinearSeq.scala
+++ b/src/library/scala/collection/mutable/LinearSeq.scala
@@ -14,8 +14,9 @@ package mutable
import generic._
-/** A subtrait of <code>collection.Seq</code> which represents sequences
- * that cannot be mutated.
+/** A subtrait of <code>collection.LinearSeq</code> which represents sequences
+ * that can be mutated.
+ * $linearSeqInfo
*
* @since 2.8
*/
diff --git a/src/library/scala/collection/mutable/LinkedListLike.scala b/src/library/scala/collection/mutable/LinkedListLike.scala
index c363609762..2523ece370 100644
--- a/src/library/scala/collection/mutable/LinkedListLike.scala
+++ b/src/library/scala/collection/mutable/LinkedListLike.scala
@@ -55,7 +55,7 @@ trait LinkedListLike[A, This <: Seq[A] with LinkedListLike[A, This]] extends Seq
}
/** Insert linked list `that` at current position of this linked list
- * @pre this linked list is not empty
+ * @note this linked list must not be empty
*/
def insert(that: This): Unit = {
require(nonEmpty, "insert into empty list")
diff --git a/src/library/scala/collection/mutable/ListBuffer.scala b/src/library/scala/collection/mutable/ListBuffer.scala
index 686b1acf8d..b8e5aeb262 100644
--- a/src/library/scala/collection/mutable/ListBuffer.scala
+++ b/src/library/scala/collection/mutable/ListBuffer.scala
@@ -231,7 +231,7 @@ final class ListBuffer[A]
*
* @param n the index which refers to the element to delete.
* @return the element that was formerly at position <code>n</code>.
- * @pre an element exists at position <code>n</code>
+ * @note an element must exist at position <code>n</code>
* @throws Predef.IndexOutOfBoundsException if <code>n</code> is out of bounds.
*/
def remove(n: Int): A = {
@@ -335,5 +335,5 @@ final class ListBuffer[A]
*/
object ListBuffer extends SeqFactory[ListBuffer] {
implicit def canBuildFrom[A]: CanBuildFrom[Coll, A, ListBuffer[A]] = new GenericCanBuildFrom[A]
- def newBuilder[A]: Builder[A, ListBuffer[A]] = new AddingBuilder(new ListBuffer[A])
+ def newBuilder[A]: Builder[A, ListBuffer[A]] = new GrowingBuilder(new ListBuffer[A])
}
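The `newBuilder` change above swaps `AddingBuilder` (which builds via the collection's `+`) for `GrowingBuilder` (which fills the buffer in place via `+=`). A minimal sketch of the resulting builder behavior, not part of the patch, using the stable `ListBuffer.newBuilder` API:

```scala
import scala.collection.mutable.ListBuffer

// A GrowingBuilder grows the underlying ListBuffer by side effect (+=),
// instead of repeatedly invoking the collection's + as AddingBuilder did.
val b = ListBuffer.newBuilder[Int]
b += 1
b += 2
b += 3
val buf = b.result()  // the ListBuffer itself is the result
```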
diff --git a/src/library/scala/collection/mutable/ListMap.scala b/src/library/scala/collection/mutable/ListMap.scala
index 82cc6340c0..c96873c81d 100644
--- a/src/library/scala/collection/mutable/ListMap.scala
+++ b/src/library/scala/collection/mutable/ListMap.scala
@@ -14,7 +14,7 @@ package mutable
import generic._
-/**
+/** A simple map backed by a list.
* @since 2.8
*/
@serializable
diff --git a/src/library/scala/collection/mutable/MapLike.scala b/src/library/scala/collection/mutable/MapLike.scala
index a1bb25910a..9c3c1c0e5f 100644
--- a/src/library/scala/collection/mutable/MapLike.scala
+++ b/src/library/scala/collection/mutable/MapLike.scala
@@ -13,6 +13,7 @@ package scala.collection
package mutable
import generic._
+import annotation.migration
/** A template trait for mutable maps of type `mutable.Map[A, B]` which
* associate keys of type `A` with values of type `B`.
@@ -41,7 +42,7 @@ import generic._
* {{{
 * def empty: This
* }}}
- * If you wish to avoid the unncessary construction of an `Option`
+ * If you wish to avoid the unnecessary construction of an `Option`
* object, you could also override `apply`, `update`,
* and `delete`.
@@ -51,7 +52,7 @@ import generic._
* @define Coll mutable.Map
*/
trait MapLike[A, B, +This <: MapLike[A, B, This] with Map[A, B]]
- extends MapLikeBase[A, B, This]
+ extends scala.collection.MapLike[A, B, This]
with Builder[(A, B), This]
with Growable[(A, B)]
with Shrinkable[A]
@@ -101,23 +102,24 @@ trait MapLike[A, B, +This <: MapLike[A, B, This] with Map[A, B]]
def += (kv: (A, B)): this.type
/** Creates a new map consisting of all key/value pairs of the current map
- * plus a new pair of a guven key and value.
+ * plus a new pair of a given key and value.
*
* @param key The key to add
* @param value The new value
* @return A fresh immutable map with the binding from `key` to
* `value` added to this map.
*/
- override def updated[B1 >: B](key: A, value: B1): mutable.Map[A, B1] = this + ((key, value))
+ override def updated[B1 >: B](key: A, value: B1): Map[A, B1] = this + ((key, value))
/** Add a new key/value mapping and return the map itself.
*
* @param kv the key/value mapping to be added
*/
- @deprecated("This operation will create a new map in the future. To add an element as a side\n"+
- "effect to an existing map and return that map itself, use +=. If you do want\n"+
- "to create a fresh map, you can use `clone() +=' to avoid a @deprecated warning.")
- def + (kv: (A, B)): this.type = { update(kv._1, kv._2); this }
+ @migration(2, 8,
+ "As of 2.8, this operation creates a new map. To add an element as a\n"+
+ "side effect to an existing map and return that map itself, use +=."
+ )
+ def + [B1 >: B] (kv: (A, B1)): Map[A, B1] = clone().asInstanceOf[Map[A, B1]] += kv
 /** Adds two or more key/value mappings and returns
 * the map itself with the added elements.
@@ -126,11 +128,12 @@ trait MapLike[A, B, +This <: MapLike[A, B, This] with Map[A, B]]
* @param elem2 the second element to add.
* @param elems the remaining elements to add.
*/
- @deprecated("This operation will create a new map in the future. To add an element as a side\n"+
- "effect to an existing map and return that map itself, use +=. If you do want to\n"+
- "create a fresh map, you can use `clone() +=` to avoid a @deprecated warning.")
- def +(elem1: (A, B), elem2: (A, B), elems: (A, B)*): this.type =
- this += elem1 += elem2 ++= elems
+ @migration(2, 8,
+ "As of 2.8, this operation creates a new map. To add an element as a\n"+
+ "side effect to an existing map and return that map itself, use +=."
+ )
+ override def + [B1 >: B] (elem1: (A, B1), elem2: (A, B1), elems: (A, B1) *): Map[A, B1] =
+ clone().asInstanceOf[Map[A, B1]] += elem1 += elem2 ++= elems
/** Adds a number of elements provided by a traversable object
* via its `iterator` method and returns
@@ -139,21 +142,12 @@ trait MapLike[A, B, +This <: MapLike[A, B, This] with Map[A, B]]
*
* @param iter the traversable object.
*/
- @deprecated("This operation will create a new map in the future. To add elements as a side\n"+
- "effect to an existing map and return that map itself, use ++=. If you do want\n"+
- "to create a fresh map, you can use `clone() ++=` to avoid a @deprecated warning.")
- def ++(iter: Traversable[(A, B)]): this.type = { for (elem <- iter) +=(elem); this }
-
- /** Adds a number of elements provided by an iterator
- * via its `iterator` method and returns
- * the collection itself.
- *
- * @param iter the iterator
- */
- @deprecated("This operation will create a new map in the future. To add elements as a side\n"+
- "effect to an existing map and return that map itself, use ++=. If you do want\n"+
- "to create a fresh map, you can use `clone() +=` to avoid a @deprecated warning.")
- def ++(iter: Iterator[(A, B)]): this.type = { for (elem <- iter) +=(elem); this }
+ @migration(2, 8,
+ "As of 2.8, this operation creates a new map. To add the elements as a\n"+
+ "side effect to an existing map and return that map itself, use ++=."
+ )
+ override def ++[B1 >: B](xs: TraversableOnce[(A, B1)]): Map[A, B1] =
+ clone().asInstanceOf[Map[A, B1]] ++= xs
/** Removes a key from this map, returning the value associated previously
* with that key as an option.
@@ -176,30 +170,31 @@ trait MapLike[A, B, +This <: MapLike[A, B, This] with Map[A, B]]
/** Delete a key from this map if it is present and return the map itself.
* @param key the key to be removed
*/
- @deprecated("This operation will create a new map in the future. To add elements as a side\n"+
- "effect to an existing map and return that map itself, use -=. If you do want\n"+
- "to create a fresh map, you can use `clone() -=` to avoid a @deprecated warning.")
- override def -(key: A): This = { -=(key); repr }
+ @migration(2, 8,
+ "As of 2.8, this operation creates a new map. To remove an element as a\n"+
+ "side effect to an existing map and return that map itself, use -=."
+ )
+ override def -(key: A): This = clone() -= key
/** If given key is defined in this map, remove it and return associated value as an Option.
* If key is not present return None.
* @param key the key to be removed
*/
- @deprecated("Use `remove' instead") def removeKey(key: A): Option[B] = remove(key)
+ @deprecated("Use `remove' instead") def removeKey(key: A): Option[B] = remove(key)
/** Removes all bindings from the map. After this operation has completed,
* the map will be empty.
*/
- def clear() { for ((k, v) <- this.iterator) -=(k) }
+ def clear() { keysIterator foreach -= }
/** If given key is already in this map, returns associated value
* Otherwise, computes value from given expression `op`, stores with key
* in map and returns that value.
- * @param the key to test
- * @param the computation yielding the value to associate with `key`, if
- * `key` is previosuly unbound.
- * @return the value associated with key (either previously or as a result
- * of executing the method).
+ * @param key the key to test
+ * @param op the computation yielding the value to associate with `key`, if
+ * `key` is previously unbound.
+ * @return the value associated with key (either previously or as a result
+ * of executing the method).
*/
def getOrElseUpdate(key: A, op: => B): B =
get(key) match {
@@ -209,7 +204,7 @@ trait MapLike[A, B, +This <: MapLike[A, B, This] with Map[A, B]]
/** Applies a transformation function to all values contained in this map.
* The transformation function produces new values from existing keys
- * asssociated values.
+ * associated values.
*
* @param f the transformation to apply
* @return the map itself.
@@ -232,8 +227,7 @@ trait MapLike[A, B, +This <: MapLike[A, B, This] with Map[A, B]]
this
}
- override def clone(): This =
- empty ++= repr
+ override def clone(): This = empty ++= repr
/** The result when this map is used as a builder
* @return the map representation itself.
@@ -247,35 +241,21 @@ trait MapLike[A, B, +This <: MapLike[A, B, This] with Map[A, B]]
* @param elem2 the second element to remove.
* @param elems the remaining elements to remove.
*/
- @deprecated("Use -= instead if you intend to remove by side effect from an existing collection.\n"+
- "Use `clone() -=' if you intend to create a new collection.")
- override def -(elem1: A, elem2: A, elems: A*): This = {
- this -= elem1 -= elem2 --= elems
- repr
- }
+ @migration(2, 8,
+ "As of 2.8, this operation creates a new map. To remove an element as a\n"+
+ "side effect to an existing map and return that map itself, use -=."
+ )
+ override def -(elem1: A, elem2: A, elems: A*): This =
+ clone() -= elem1 -= elem2 --= elems
/** Removes a number of elements provided by a Traversable object and returns
* the collection itself.
*
* @param iter the Traversable object.
*/
- @deprecated("Use --= instead if you intend to remove by side effect from an existing collection.\n"+
- "Use `clone() --=' if you intend to create a new collection.")
- override def --(iter: Traversable[A]): This = {
- for (elem <- iter) -=(elem)
- repr
- }
-
-
- /** Removes a number of elements provided by an iterator and returns
- * the collection itself.
- *
- * @param iter the iterator
- */
- @deprecated("Use --= instead if you intend to remove by side effect from an existing collection.\n"+
- "Use `clone() --=' if you intend to create a new collection.")
- override def --(iter: Iterator[A]): This = {
- for (elem <- iter) -=(elem)
- repr
- }
+ @migration(2, 8,
+ "As of 2.8, this operation creates a new map. To remove the elements as a\n"+
+ "side effect to an existing map and return that map itself, use --=."
+ )
+ override def --(xs: TraversableOnce[A]): This = clone() --= xs
}
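The `@migration` messages above describe the new contract: `+`, `++`, `-` and `--` on a mutable map now clone first and return a fresh map, while `+=`/`++=`/`-=`/`--=` mutate in place. A small sketch (not part of the patch) of the two idioms on the stable mutable `Map` API:

```scala
import scala.collection.mutable

val m = mutable.Map(1 -> "one")

// In-place addition: mutates m and returns m itself.
m += (2 -> "two")

// Non-destructive addition, the pattern the migrated `+` uses internally:
// clone first, then add by side effect to the copy only.
val m2 = m.clone() += (3 -> "three")
// m is left untouched; only the clone sees the new binding.
```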
diff --git a/src/library/scala/collection/mutable/MapLikeBase.scala b/src/library/scala/collection/mutable/MapLikeBase.scala
deleted file mode 100644
index 402df79d84..0000000000
--- a/src/library/scala/collection/mutable/MapLikeBase.scala
+++ /dev/null
@@ -1,37 +0,0 @@
-/* __ *\
-** ________ ___ / / ___ Scala API **
-** / __/ __// _ | / / / _ | (c) 2003-2010, LAMP/EPFL **
-** __\ \/ /__/ __ |/ /__/ __ | http://scala-lang.org/ **
-** /____/\___/_/ |_/____/_/ | | **
-** |/ **
-\* */
-
-// $Id$
-
-
-package scala.collection
-package mutable
-
-/** The reason for this class is so that we can
- * have both a generic immutable `+` with signature
- *
- * def + [B1 >: B](kv: (A, B1)): Map[A, B1]
- *
- * and a (deprecated) mutable `+` of signature
- *
- * def + (kv: (A, B)): this.type = this += kv
- *
- * The former is required to fulfill the Map contract.
- * The latter is required for backwards compatibility.
- * We can't have both methods in the same class, as that would give a double definition.
- * They are OK in different classes though, and narrowly escape a `same erasure' problem.
- * Once the deprecated + goes away we can do without class MapLikeBase.
- *
- * @author Martin Odersky
- * @version 2.8
- * @since 2.8
- */
-trait MapLikeBase[A, B, +This <: MapLikeBase[A, B, This] with Map[A, B]]
- extends scala.collection.MapLike[A, B, This] with Cloneable[This] {
- def + [B1 >: B] (kv: (A, B1)): mutable.Map[A, B1] = clone().asInstanceOf[mutable.Map[A, B1]] += kv
-}
diff --git a/src/library/scala/collection/mutable/MapProxy.scala b/src/library/scala/collection/mutable/MapProxy.scala
index 55e6eba1e3..cb768f6778 100644
--- a/src/library/scala/collection/mutable/MapProxy.scala
+++ b/src/library/scala/collection/mutable/MapProxy.scala
@@ -28,14 +28,16 @@ package mutable
trait MapProxy[A, B] extends Map[A, B] with MapProxyLike[A, B, Map[A, B]]
{
+ private def newProxy[B1 >: B](newSelf: Map[A, B1]): MapProxy[A, B1] =
+ new MapProxy[A, B1] { val self = newSelf }
+
override def repr = this
override def empty: MapProxy[A, B] = new MapProxy[A, B] { val self = MapProxy.this.self.empty }
- override def +(kv: (A, B)) = { self.update(kv._1, kv._2) ; this }
- override def + [B1 >: B] (elem1: (A, B1), elem2: (A, B1), elems: (A, B1) *) =
- { self.+(elem1, elem2, elems: _*) ; this }
+ override def + [B1 >: B] (kv: (A, B1)): Map[A, B1] = newProxy(self + kv)
+ override def + [B1 >: B] (elem1: (A, B1), elem2: (A, B1), elems: (A, B1) *) = newProxy(self.+(elem1, elem2, elems: _*))
- override def -(key: A) = { self.remove(key); this }
+ override def -(key: A) = newProxy(self - key)
override def += (kv: (A, B)) = { self += kv ; this }
override def -= (key: A) = { self -= key ; this }
diff --git a/src/library/scala/collection/mutable/MultiMap.scala b/src/library/scala/collection/mutable/MultiMap.scala
index e335500349..01ddea070c 100644
--- a/src/library/scala/collection/mutable/MultiMap.scala
+++ b/src/library/scala/collection/mutable/MultiMap.scala
@@ -25,6 +25,9 @@ package mutable
trait MultiMap[A, B] extends Map[A, Set[B]] {
protected def makeSet: Set[B] = new HashSet[B]
+ @deprecated("use addBinding instead")
+ def add(key: A, value: B): this.type = addBinding(key, value)
+
def addBinding(key: A, value: B): this.type = {
get(key) match {
case None =>
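The hunk above deprecates `add` in favor of `addBinding`. A brief usage sketch (not part of the patch) of the `MultiMap` mixin, which stores several values per key in a backing `Map[A, Set[B]]`:

```scala
import scala.collection.mutable

// MultiMap is mixed into a mutable Map whose values are Sets.
val mm = new mutable.HashMap[String, mutable.Set[Int]]
             with mutable.MultiMap[String, Int]

mm.addBinding("evens", 2)
mm.addBinding("evens", 4)   // second binding under the same key
mm.addBinding("odds", 1)
// mm("evens") now holds both values in one Set
```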
diff --git a/src/library/scala/collection/mutable/MutableList.scala b/src/library/scala/collection/mutable/MutableList.scala
index e6423aa677..7784927c87 100644
--- a/src/library/scala/collection/mutable/MutableList.scala
+++ b/src/library/scala/collection/mutable/MutableList.scala
@@ -29,7 +29,7 @@ import immutable.{List, Nil}
*/
@serializable @SerialVersionUID(5938451523372603072L)
class MutableList[A] extends LinearSeq[A]
- with LinearSeqLike[A, MutableList[A]]
+ with LinearSeqOptimized[A, MutableList[A]]
with Builder[A, MutableList[A]] {
override protected[this] def newBuilder = new MutableList[A]
diff --git a/src/library/scala/collection/mutable/OpenHashMap.scala b/src/library/scala/collection/mutable/OpenHashMap.scala
index 9f0d9d2c25..79bb96a0bf 100644
--- a/src/library/scala/collection/mutable/OpenHashMap.scala
+++ b/src/library/scala/collection/mutable/OpenHashMap.scala
@@ -15,7 +15,7 @@ package mutable
/**
* @since 2.7
*/
-object OpenHashMap{
+object OpenHashMap {
def apply[K, V](elems : (K, V)*) = {
val dict = new OpenHashMap[K, V];
elems.foreach({case (x, y) => dict(x) = y});
diff --git a/src/library/scala/collection/mutable/PriorityQueue.scala b/src/library/scala/collection/mutable/PriorityQueue.scala
index c4dac9effb..4d74a2ee74 100644
--- a/src/library/scala/collection/mutable/PriorityQueue.scala
+++ b/src/library/scala/collection/mutable/PriorityQueue.scala
@@ -13,7 +13,7 @@ package scala.collection
package mutable
import generic._
-
+import annotation.migration
/** This class implements priority queues using a heap.
* To prioritize elements of type T there must be an implicit
@@ -28,7 +28,6 @@ import generic._
class PriorityQueue[A](implicit ord: Ordering[A])
extends Seq[A]
with SeqLike[A, PriorityQueue[A]]
- with Addable[A, PriorityQueue[A]]
with Growable[A]
with Cloneable[PriorityQueue[A]]
with Builder[A, PriorityQueue[A]]
@@ -47,8 +46,8 @@ class PriorityQueue[A](implicit ord: Ordering[A])
private val resarr = new ResizableArrayAccess[A]
- resarr.p_size0 += 1 // we do not use array(0)
- override def length: Int = resarr.length - 1 // adjust length accordingly
+ resarr.p_size0 += 1 // we do not use array(0)
+ override def length: Int = resarr.length - 1 // adjust length accordingly
override def size: Int = length
override def isEmpty: Boolean = resarr.p_size0 < 2
override def repr = this
@@ -116,6 +115,23 @@ class PriorityQueue[A](implicit ord: Ordering[A])
}
}
+ @deprecated(
+ "Use += instead if you intend to add by side effect to an existing collection.\n"+
+ "Use `clone() +=' if you intend to create a new collection."
+ )
+ def +(elem: A): PriorityQueue[A] = { this.clone() += elem }
+
+ /** Add two or more elements to this priority queue.
+ * @param elem1 the first element.
+ * @param elem2 the second element.
+ * @param elems the remaining elements.
+ */
+ @deprecated(
+ "Use ++= instead if you intend to add by side effect to an existing collection.\n"+
+ "Use `clone() ++=' if you intend to create a new collection."
+ )
+ def +(elem1: A, elem2: A, elems: A*) = { this.clone().+=(elem1, elem2, elems : _*) }
+
/** Inserts a single element into the priority queue.
*
* @param elem the element to insert
@@ -128,27 +144,12 @@ class PriorityQueue[A](implicit ord: Ordering[A])
this
}
- def +(elem: A): PriorityQueue[A] = { this.clone() += elem }
-
- /** Add two or more elements to this set.
- * @param elem1 the first element.
- * @param kv2 the second element.
- * @param kvs the remaining elements.
- */
- override def +(elem1: A, elem2: A, elems: A*) = { this.clone().+=(elem1, elem2, elems : _*) }
-
/** Adds all elements provided by an <code>Iterable</code> object
* into the priority queue.
*
* @param iter an iterable object
*/
- override def ++(elems: scala.collection.Traversable[A]) = { this.clone() ++= elems }
-
- /** Adds all elements provided by an iterator into the priority queue.
- *
- * @param it an iterator
- */
- override def ++(iter: Iterator[A]) = { this.clone() ++= iter } // ...whereas this doesn't?
+ def ++(xs: TraversableOnce[A]) = { this.clone() ++= xs }
/** Adds all elements to the queue.
*
@@ -223,7 +224,7 @@ class PriorityQueue[A](implicit ord: Ordering[A])
}
override def reverseIterator = new Iterator[A] {
- val arr = new Array[Any](size)
+ val arr = new Array[Any](PriorityQueue.this.size)
iterator.copyToArray(arr)
var i = arr.size - 1
def hasNext: Boolean = i >= 0
@@ -267,9 +268,9 @@ class PriorityQueue[A](implicit ord: Ordering[A])
// }
}
-
-
-
-
-
-
+// !!! TODO - but no SortedSeqFactory (yet?)
+// object PriorityQueue extends SeqFactory[PriorityQueue] {
+// def empty[A](implicit ord: Ordering[A]): PriorityQueue[A] = new PriorityQueue[A](ord)
+// implicit def canBuildFrom[A](implicit ord: Ordering[A]): CanBuildFrom[Coll, A, PriorityQueue] =
+// }
+//
\ No newline at end of file
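As in `MapLike`, the non-destructive `PriorityQueue` operations now clone before adding, so the original queue is never touched. A sketch (not part of the patch) of both styles on the stable API, where `dequeue` yields the largest element under the implicit `Ordering`:

```scala
import scala.collection.mutable.PriorityQueue

// In-place growth via ++= ; the queue orders elements as a max-heap.
val pq = new PriorityQueue[Int]() ++= List(3, 1, 4, 1, 5)
val top = pq.dequeue()            // largest element comes out first

// Non-destructive addition: clone, then grow the copy by side effect.
val pq2 = pq.clone() ++= List(9)  // pq itself is unaffected
```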
diff --git a/src/library/scala/collection/mutable/PriorityQueueProxy.scala b/src/library/scala/collection/mutable/PriorityQueueProxy.scala
index 0771ce6b60..427ffe478a 100644
--- a/src/library/scala/collection/mutable/PriorityQueueProxy.scala
+++ b/src/library/scala/collection/mutable/PriorityQueueProxy.scala
@@ -48,21 +48,11 @@ abstract class PriorityQueueProxy[A](implicit ord: Ordering[A]) extends Priority
*/
override def +=(elem: A): this.type = { self += elem; this }
- /** Adds all elements provided by an <code>Iterable</code> object
- * into the priority queue.
- *
- * @param iter an iterable object
- */
- def ++=(iter: scala.collection.Iterable[A]): this.type = {
- self ++= iter
- this
- }
-
/** Adds all elements provided by an iterator into the priority queue.
*
* @param it an iterator
*/
- override def ++=(it: Iterator[A]): this.type = {
+ override def ++=(it: TraversableOnce[A]): this.type = {
self ++= it
this
}
diff --git a/src/library/scala/collection/mutable/Publisher.scala b/src/library/scala/collection/mutable/Publisher.scala
index 4f675eff9f..58e4394ef7 100644
--- a/src/library/scala/collection/mutable/Publisher.scala
+++ b/src/library/scala/collection/mutable/Publisher.scala
@@ -47,7 +47,7 @@ trait Publisher[Evt] {
def removeSubscriptions() { filters.clear }
protected def publish(event: Evt) {
- filters.keysIterator.foreach(sub =>
+ filters.keys.foreach(sub =>
if (!suspended.contains(sub) &&
filters.entryExists(sub, p => p(event)))
sub.notify(self, event)
diff --git a/src/library/scala/collection/mutable/Queue.scala b/src/library/scala/collection/mutable/Queue.scala
index 3b09ceba91..3754dbc3f2 100644
--- a/src/library/scala/collection/mutable/Queue.scala
+++ b/src/library/scala/collection/mutable/Queue.scala
@@ -24,7 +24,6 @@ import generic._
*/
@serializable @cloneable
class Queue[A] extends MutableList[A] with Cloneable[Queue[A]] {
-
/** Adds all elements to the queue.
*
* @param elems the elements to add.
@@ -144,3 +143,8 @@ class Queue[A] extends MutableList[A] with Cloneable[Queue[A]] {
*/
def front: A = first0.elem
}
+
+// !!! TODO - integrate
+object Queue {
+ def apply[A](xs: A*): Queue[A] = new Queue[A] ++= xs
+}
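The TODO companion above builds a `Queue` by enqueueing its arguments left to right via `++=`. A sketch of the same construction and the resulting FIFO behavior (assumptions: the standard mutable `Queue` API, not part of the patch):

```scala
import scala.collection.mutable.Queue

// Equivalent to the patch's Queue.apply: enqueue arguments left to right.
val q = new Queue[Int] ++= Seq(10, 20, 30)

// FIFO: the first element enqueued is the first one dequeued.
val first = q.dequeue()
```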
diff --git a/src/library/scala/collection/mutable/QueueProxy.scala b/src/library/scala/collection/mutable/QueueProxy.scala
index a322934f63..b2548b26cc 100644
--- a/src/library/scala/collection/mutable/QueueProxy.scala
+++ b/src/library/scala/collection/mutable/QueueProxy.scala
@@ -45,24 +45,13 @@ trait QueueProxy[A] extends Queue[A] with Proxy {
*/
override def +=(elem: A): this.type = { self += elem; this }
- /** Adds all elements provided by an <code>Iterable</code> object
- * at the end of the queue. The elements are prepended in the order they
- * are given out by the iterator.
- *
- * @param iter an iterable object
- */
- def ++=(iter: scala.collection.Iterable[A]): this.type = {
- self ++= iter
- this
- }
-
/** Adds all elements provided by an iterator
* at the end of the queue. The elements are prepended in the order they
* are given out by the iterator.
*
* @param iter an iterator
*/
- override def ++=(it: Iterator[A]): this.type = {
+ override def ++=(it: TraversableOnce[A]): this.type = {
self ++= it
this
}
diff --git a/src/library/scala/collection/mutable/ResizableArray.scala b/src/library/scala/collection/mutable/ResizableArray.scala
index 435eb3ee0f..80ab1cd559 100644
--- a/src/library/scala/collection/mutable/ResizableArray.scala
+++ b/src/library/scala/collection/mutable/ResizableArray.scala
@@ -24,7 +24,7 @@ import generic._
*/
trait ResizableArray[A] extends IndexedSeq[A]
with GenericTraversableTemplate[A, ResizableArray]
- with IndexedSeqLike[A, ResizableArray[A]] {
+ with IndexedSeqOptimized[A, ResizableArray[A]] {
override def companion: GenericCompanion[ResizableArray] = ResizableArray
diff --git a/src/library/scala/collection/mutable/SetBuilder.scala b/src/library/scala/collection/mutable/SetBuilder.scala
index 6286c46ac4..450d76463c 100644
--- a/src/library/scala/collection/mutable/SetBuilder.scala
+++ b/src/library/scala/collection/mutable/SetBuilder.scala
@@ -13,17 +13,10 @@ package mutable
import generic._
-/** The canonical builder for collections that are addable, i.e. that support
- * an efficient + method which adds an element to the collection.
- * Collections are built from their empty element using this + method.
- * @param empty The empty element of the collection.
+/** The canonical builder for mutable Sets.
*
+ * @param empty The empty element of the collection.
* @since 2.8
*/
-class SetBuilder[A, Coll <: Addable[A, Coll] with scala.collection.Iterable[A] with scala.collection.IterableLike[A, Coll]](empty: Coll)
-extends Builder[A, Coll] {
- protected var elems: Coll = empty
- def +=(x: A): this.type = { elems = elems + x; this }
- def clear() { elems = empty }
- def result: Coll = elems
-}
+class SetBuilder[A, Coll <: Addable[A, Coll] with collection.Iterable[A] with collection.IterableLike[A, Coll]](empty: Coll)
+extends AddingBuilder[A, Coll](empty) { }
diff --git a/src/library/scala/collection/mutable/SetLike.scala b/src/library/scala/collection/mutable/SetLike.scala
index cb6eb293c1..7004e52b8e 100644
--- a/src/library/scala/collection/mutable/SetLike.scala
+++ b/src/library/scala/collection/mutable/SetLike.scala
@@ -14,6 +14,7 @@ package mutable
import generic._
import script._
+import scala.annotation.migration
/** A template trait for mutable sets of type `mutable.Set[A]`.
* @tparam A the type of the elements of the set
@@ -63,6 +64,9 @@ trait SetLike[A, +This <: SetLike[A, This] with Set[A]]
*/
override protected[this] def newBuilder: Builder[A, This] = empty
+ @migration(2, 8, "Set.map now returns a Set, so it will discard duplicate values.")
+ override def map[B, That](f: A => B)(implicit bf: CanBuildFrom[This, B, That]): That = super.map(f)(bf)
+
/** Adds an element to this $coll.
*
* @param elem the element to be added
@@ -119,7 +123,7 @@ trait SetLike[A, +This <: SetLike[A, This] with Set[A]]
*/
def clear() { foreach(-=) }
- override def clone(): mutable.Set[A] = empty ++= repr
+ override def clone(): This = empty ++= repr
/** The result when this set is used as a builder
* @return the set representation itself.
@@ -131,9 +135,11 @@ trait SetLike[A, +This <: SetLike[A, This] with Set[A]]
*
* @param elem the element to add.
*/
- @deprecated("Use += instead if you intend to add by side effect to an existing collection.\n"+
- "Use `clone() +=' if you intend to create a new collection.")
- override def + (elem: A): This = { +=(elem); repr }
+ @migration(2, 8,
+ "As of 2.8, this operation creates a new set. To add an element as a\n"+
+ "side effect to an existing set and return that set itself, use +=."
+ )
+ override def + (elem: A): This = clone() += elem
/** Adds two or more elements to this collection and returns
* the collection itself.
@@ -142,45 +148,34 @@ trait SetLike[A, +This <: SetLike[A, This] with Set[A]]
* @param elem2 the second element to add.
* @param elems the remaining elements to add.
*/
- @deprecated("Use += instead if you intend to add by side effect to an existing collection.\n"+
- "Use `clone() +=' if you intend to create a new collection.")
- override def + (elem1: A, elem2: A, elems: A*): This = {
- this += elem1 += elem2 ++= elems
- repr
- }
+ @migration(2, 8,
+ "As of 2.8, this operation creates a new set. To add the elements as a\n"+
+ "side effect to an existing set and return that set itself, use +=."
+ )
+ override def + (elem1: A, elem2: A, elems: A*): This =
+ clone() += elem1 += elem2 ++= elems
/** Adds a number of elements provided by a traversable object and returns
 * the collection itself.
*
* @param iter the iterable object.
*/
- @deprecated("Use ++= instead if you intend to add by side effect to an existing collection.\n"+
- "Use `clone() ++=' if you intend to create a new collection.")
- override def ++(iter: scala.collection.Traversable[A]): This = {
- for (elem <- iter) +=(elem)
- repr
- }
-
- /** Adds a number of elements provided by an iterator and returns
- * the collection itself.
- *
- * @param iter the iterator
- */
- @deprecated("Use ++= instead if you intend to add by side effect to an existing collection.\n"+
- "Use `clone() ++=' if you intend to create a new collection.")
- override def ++ (iter: Iterator[A]): This = {
- for (elem <- iter) +=(elem)
- repr
- }
+ @migration(2, 8,
+ "As of 2.8, this operation creates a new set. To add the elements as a\n"+
+ "side effect to an existing set and return that set itself, use ++=."
+ )
+ override def ++(xs: TraversableOnce[A]): This = clone() ++= xs
/** Removes a single element from this collection and returns
* the collection itself.
*
* @param elem the element to remove.
*/
- @deprecated("Use -= instead if you intend to remove by side effect from an existing collection.\n"+
- "Use `clone() -=' if you intend to create a new collection.")
- override def -(elem: A): This = { -=(elem); repr }
+ @migration(2, 8,
+ "As of 2.8, this operation creates a new set. To remove the element as a\n"+
+ "side effect to an existing set and return that set itself, use -=."
+ )
+ override def -(elem: A): This = clone() -= elem
/** Removes two or more elements from this collection and returns
* the collection itself.
@@ -189,36 +184,23 @@ trait SetLike[A, +This <: SetLike[A, This] with Set[A]]
* @param elem2 the second element to remove.
* @param elems the remaining elements to remove.
*/
- @deprecated("Use -= instead if you intend to remove by side effect from an existing collection.\n"+
- "Use `clone() -=' if you intend to create a new collection.")
- override def -(elem1: A, elem2: A, elems: A*): This = {
- this -= elem1 -= elem2 --= elems
- repr
- }
+ @migration(2, 8,
+ "As of 2.8, this operation creates a new set. To remove the elements as a\n"+
+ "side effect to an existing set and return that set itself, use -=."
+ )
+ override def -(elem1: A, elem2: A, elems: A*): This =
+ clone() -= elem1 -= elem2 --= elems
/** Removes a number of elements provided by a Traversable object and returns
* the collection itself.
*
* @param iter the Traversable object.
*/
- @deprecated("Use --= instead if you intend to remove by side effect from an existing collection.\n"+
- "Use `clone() --=' if you intend to create a new collection.")
- override def --(iter: scala.collection.Traversable[A]): This = {
- for (elem <- iter) -=(elem)
- repr
- }
-
- /** Removes a number of elements provided by an iterator and returns
- * the collection itself.
- *
- * @param iter the iterator
- */
- @deprecated("Use --= instead if you intend to remove by side effect from an existing collection.\n"+
- "Use `clone() --=' if you intend to create a new collection.")
- override def --(iter: Iterator[A]): This = {
- for (elem <- iter) -=(elem)
- repr
- }
+ @migration(2, 8,
+ "As of 2.8, this operation creates a new set. To remove the elements as a\n"+
+ "side effect to an existing set and return that set itself, use --=."
+ )
+ override def --(xs: TraversableOnce[A]): This = clone() --= xs
/** Send a message to this scriptable object.
*
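The `Set.map` migration above is worth illustrating: because `map` on a set again builds a set, distinct inputs that map to the same output are silently merged. A minimal sketch, not part of the patch:

```scala
import scala.collection.mutable

val words = mutable.Set("apple", "avocado", "banana")

// map returns a Set, so colliding results collapse into one element:
// "apple" and "avocado" both map to 'a'.
val initials = words.map(_.head)
```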
diff --git a/src/library/scala/collection/mutable/Stack.scala b/src/library/scala/collection/mutable/Stack.scala
index bbb4189dc3..45e9fa24b2 100644
--- a/src/library/scala/collection/mutable/Stack.scala
+++ b/src/library/scala/collection/mutable/Stack.scala
@@ -15,6 +15,7 @@ package mutable
import generic._
import collection.immutable.{List, Nil}
import collection.Iterator
+import annotation.migration
/** A stack implements a data structure which allows to store and retrieve
* objects in a last-in-first-out (LIFO) fashion.
@@ -63,19 +64,11 @@ class Stack[A] private (var elems: List[A]) extends scala.collection.Seq[A] with
* @param elems the iterator object.
* @return the stack with the new elements on top.
*/
- def pushAll(elems: Iterator[A]): this.type = { for (elem <- elems) { push(elem); () }; this }
+ def pushAll(xs: TraversableOnce[A]): this.type = { xs foreach push ; this }
- /** Push all elements provided by the given iterable object onto
- * the stack. The last element returned by the traversable object
- * will be on top of the new stack.
- *
- * @param elems the iterable object.
- * @return the stack with the new elements on top.
- */
- def pushAll(elems: scala.collection.Traversable[A]): this.type = { for (elem <- elems) { push(elem); () }; this }
-
- @deprecated("use pushAll") def ++=(it: Iterator[A]): this.type = pushAll(it)
- @deprecated("use pushAll") def ++=(it: scala.collection.Iterable[A]): this.type = pushAll(it)
+ @deprecated("use pushAll")
+ @migration(2, 8, "Stack ++= now pushes arguments on the stack from left to right.")
+ def ++=(xs: TraversableOnce[A]): this.type = pushAll(xs)
/** Returns the top element of the stack. This method will not remove
* the element from the stack. An error is signaled if there is no
@@ -112,17 +105,27 @@ class Stack[A] private (var elems: List[A]) extends scala.collection.Seq[A] with
*
* @return an iterator over all stack elements.
*/
+ @migration(2, 8, "Stack iterator and foreach now traverse in FIFO order.")
override def iterator: Iterator[A] = elems.iterator
/** Creates a list of all stack elements in LIFO order.
*
* @return the created list.
*/
+ @migration(2, 8, "Stack iterator and foreach now traverse in FIFO order.")
override def toList: List[A] = elems
+ @migration(2, 8, "Stack iterator and foreach now traverse in FIFO order.")
+ override def foreach[U](f: A => U): Unit = super.foreach(f)
+
/** This method clones the stack.
*
* @return a stack with the same elements.
*/
override def clone(): Stack[A] = new Stack[A](elems)
}
+
+// !!! TODO - integrate
+object Stack {
+ def apply[A](xs: A*): Stack[A] = new Stack[A] ++= xs
+}
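The `Stack` hunks above pin down push order: `pushAll` (and the deprecated `++=` that forwards to it) pushes arguments left to right, so the last element lands on top. A brief sketch of that contract (not part of the patch; iteration order is the separately migrated FIFO behavior and is not relied on here):

```scala
import scala.collection.mutable.Stack

val s = new Stack[Int]()
// Left-to-right push order: 1 at the bottom, 3 on top.
s.pushAll(List(1, 2, 3))

val top = s.pop()   // LIFO: most recently pushed element first
```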
diff --git a/src/library/scala/collection/mutable/StackProxy.scala b/src/library/scala/collection/mutable/StackProxy.scala
index 99f556112a..d3810dd158 100644
--- a/src/library/scala/collection/mutable/StackProxy.scala
+++ b/src/library/scala/collection/mutable/StackProxy.scala
@@ -48,15 +48,7 @@ trait StackProxy[A] extends Stack[A] with Proxy {
this
}
- override def pushAll(elems: Iterator[A]): this.type = {
- self pushAll elems
- this
- }
-
- override def pushAll(elems: scala.collection.Traversable[A]): this.type = {
- self pushAll elems
- this
- }
+ override def pushAll(xs: TraversableOnce[A]): this.type = { self pushAll xs; this }
/** Pushes all elements provided by an <code>Iterable</code> object
* on top of the stack. The elements are pushed in the order they
@@ -64,21 +56,8 @@ trait StackProxy[A] extends Stack[A] with Proxy {
*
* @param iter an iterable object
*/
- @deprecated("use pushAll") override def ++=(iter: scala.collection.Iterable[A]): this.type = {
- self ++= iter
- this
- }
+ @deprecated("use pushAll") override def ++=(xs: TraversableOnce[A]): this.type = { self ++= xs ; this }
- /** Pushes all elements provided by an iterator
- * on top of the stack. The elements are pushed in the order they
- * are given out by the iterator.
- *
- * @param iter an iterator
- */
- @deprecated("use pushAll") override def ++=(it: Iterator[A]): this.type = {
- self ++= it
- this
- }
override def push(elem1: A, elem2: A, elems: A*): this.type = {
self.push(elem1).push(elem2).pushAll(elems)
diff --git a/src/library/scala/collection/mutable/SynchronizedBuffer.scala b/src/library/scala/collection/mutable/SynchronizedBuffer.scala
index f6caa57729..1c9a77c46a 100644
--- a/src/library/scala/collection/mutable/SynchronizedBuffer.scala
+++ b/src/library/scala/collection/mutable/SynchronizedBuffer.scala
@@ -60,8 +60,8 @@ trait SynchronizedBuffer[A] extends Buffer[A] {
*
* @param iter the iterable object.
*/
- override def ++(iter: Traversable[A]): Self = synchronized {
- super.++(iter)
+ override def ++(xs: TraversableOnce[A]): Self = synchronized {
+ super.++(xs)
}
/** Appends a number of elements provided by an iterable object
@@ -69,8 +69,8 @@ trait SynchronizedBuffer[A] extends Buffer[A] {
*
* @param iter the iterable object.
*/
- override def ++=(iter: Traversable[A]): this.type = synchronized[this.type] {
- super.++=(iter)
+ override def ++=(xs: TraversableOnce[A]): this.type = synchronized[this.type] {
+ super.++=(xs)
}
/** Appends a sequence of elements to this buffer.
@@ -86,8 +86,8 @@ trait SynchronizedBuffer[A] extends Buffer[A] {
*
* @param iter the iterable object.
*/
- override def appendAll(iter: Traversable[A]): Unit = synchronized {
- super.appendAll(iter)
+ override def appendAll(xs: TraversableOnce[A]): Unit = synchronized {
+ super.appendAll(xs)
}
/** Prepend a single element to this buffer and return
@@ -105,17 +105,13 @@ trait SynchronizedBuffer[A] extends Buffer[A] {
*
* @param iter the iterable object.
*/
- override def ++=:(iter: Traversable[A]): this.type = synchronized[this.type] {
- super.++=:(iter)
- }
+ override def ++=:(xs: TraversableOnce[A]): this.type = synchronized[this.type] { super.++=:(xs) }
/** Prepend an element to this list.
*
* @param elem the element to prepend.
*/
- override def prepend(elems: A*): Unit = synchronized {
- super.prependAll(elems)
- }
+ override def prepend(elems: A*): Unit = prependAll(elems)
/** Prepends a number of elements provided by an iterable object
* via its <code>iterator</code> method. The identity of the
@@ -123,8 +119,8 @@ trait SynchronizedBuffer[A] extends Buffer[A] {
*
* @param iter the iterable object.
*/
- override def prependAll(elems: Traversable[A]): Unit = synchronized {
- super.prependAll(elems)
+ override def prependAll(xs: TraversableOnce[A]): Unit = synchronized {
+ super.prependAll(xs)
}
/** Inserts new elements at the index <code>n</code>. Opposed to method
diff --git a/src/library/scala/collection/mutable/SynchronizedMap.scala b/src/library/scala/collection/mutable/SynchronizedMap.scala
index ca29fa20b8..dabcaa7e1c 100644
--- a/src/library/scala/collection/mutable/SynchronizedMap.scala
+++ b/src/library/scala/collection/mutable/SynchronizedMap.scala
@@ -12,6 +12,7 @@
package scala.collection
package mutable
+import annotation.migration
/** This class should be used as a mixin. It synchronizes the <code>Map</code>
* functions of the class into which it is mixed in.
@@ -35,20 +36,21 @@ trait SynchronizedMap[A, B] extends Map[A, B] {
override def getOrElseUpdate(key: A, default: => B): B = synchronized { super.getOrElseUpdate(key, default) }
override def transform(f: (A, B) => B): this.type = synchronized[this.type] { super.transform(f) }
override def retain(p: (A, B) => Boolean): this.type = synchronized[this.type] { super.retain(p) }
- override def valuesIterable: scala.collection.Iterable[B] = synchronized { super.valuesIterable }
- @deprecated("Use `valuesIterator' instead") override def values: Iterator[B] = synchronized { super.valuesIterator }
+ @migration(2, 8, "As of 2.8, values returns Iterable[B] rather than Iterator[B].")
+ override def values: collection.Iterable[B] = synchronized { super.values }
override def valuesIterator: Iterator[B] = synchronized { super.valuesIterator }
override def clone(): Self = synchronized { super.clone() }
override def foreach[U](f: ((A, B)) => U) = synchronized { super.foreach(f) }
override def apply(key: A): B = synchronized { super.apply(key) }
- override def keySet: scala.collection.Set[A] = synchronized { super.keySet }
- @deprecated("Use `keysIterator' instead") override def keys: Iterator[A] = synchronized { super.keysIterator }
+ override def keySet: collection.Set[A] = synchronized { super.keySet }
+ @migration(2, 8, "As of 2.8, keys returns Iterable[A] rather than Iterator[A].")
+ override def keys: collection.Iterable[A] = synchronized { super.keys }
override def keysIterator: Iterator[A] = synchronized { super.keysIterator }
override def isEmpty: Boolean = synchronized { super.isEmpty }
override def contains(key: A): Boolean = synchronized {super.contains(key) }
override def isDefinedAt(key: A) = synchronized { super.isDefinedAt(key) }
- @deprecated("See Map.+ for explanation") override def +(kv: (A, B)): this.type = synchronized[this.type] { super.+(kv) }
+ // @deprecated("See Map.+ for explanation") override def +(kv: (A, B)): this.type = synchronized[this.type] { super.+(kv) }
// can't override -, -- same type!
// @deprecated override def -(key: A): Self = synchronized { super.-(key) }
diff --git a/src/library/scala/collection/mutable/SynchronizedPriorityQueue.scala b/src/library/scala/collection/mutable/SynchronizedPriorityQueue.scala
index 066c96a651..9d18846252 100644
--- a/src/library/scala/collection/mutable/SynchronizedPriorityQueue.scala
+++ b/src/library/scala/collection/mutable/SynchronizedPriorityQueue.scala
@@ -39,23 +39,11 @@ class SynchronizedPriorityQueue[A](implicit ord: Ordering[A]) extends PriorityQu
this
}
- /** Adds all elements provided by an <code>Iterable</code> object
- * into the priority queue.
- *
- * @param iter an iterable object
- */
- def ++=(iter: scala.collection.Iterable[A]): this.type = {
- synchronized {
- super.++=(iter)
- }
- this
- }
-
/** Adds all elements provided by an iterator into the priority queue.
*
* @param it an iterator
*/
- override def ++=(it: Iterator[A]): this.type = {
+ override def ++=(it: TraversableOnce[A]): this.type = {
synchronized {
super.++=(it)
}
@@ -87,7 +75,7 @@ class SynchronizedPriorityQueue[A](implicit ord: Ordering[A]) extends PriorityQu
*/
override def clear(): Unit = synchronized { super.clear }
- /** Returns an iterator which yiels all the elements of the priority
+ /** Returns an iterator which yields all the elements of the priority
* queue in descending priority order.
*
* @return an iterator over all elements sorted in descending order.
diff --git a/src/library/scala/collection/mutable/SynchronizedQueue.scala b/src/library/scala/collection/mutable/SynchronizedQueue.scala
index 3a1bc2e383..e7630cee06 100644
--- a/src/library/scala/collection/mutable/SynchronizedQueue.scala
+++ b/src/library/scala/collection/mutable/SynchronizedQueue.scala
@@ -42,15 +42,7 @@ class SynchronizedQueue[A] extends Queue[A] {
*
* @param iter an iterable object
*/
- override def ++=(iter: Traversable[A]): this.type = synchronized[this.type] { super.++=(iter) }
-
- /** Adds all elements provided by an iterator
- * at the end of the queue. The elements are prepended in the order they
- * are given out by the iterator.
- *
- * @param it an iterator
- */
- override def ++=(it: Iterator[A]): this.type = synchronized[this.type] { super.++=(it) }
+ override def ++=(xs: TraversableOnce[A]): this.type = synchronized[this.type] { super.++=(xs) }
/** Adds all elements to the queue.
*
diff --git a/src/library/scala/collection/mutable/SynchronizedSet.scala b/src/library/scala/collection/mutable/SynchronizedSet.scala
index a4832ba9f4..d3023b9136 100644
--- a/src/library/scala/collection/mutable/SynchronizedSet.scala
+++ b/src/library/scala/collection/mutable/SynchronizedSet.scala
@@ -39,24 +39,16 @@ trait SynchronizedSet[A] extends Set[A] {
super.+=(elem)
}
- override def ++=(that: Traversable[A]): this.type = synchronized[this.type] {
- super.++=(that)
- }
-
- override def ++=(it: Iterator[A]): this.type = synchronized[this.type] {
- super.++=(it)
+ override def ++=(xs: TraversableOnce[A]): this.type = synchronized[this.type] {
+ super.++=(xs)
}
abstract override def -=(elem: A): this.type = synchronized[this.type] {
super.-=(elem)
}
- override def --=(that: Traversable[A]): this.type = synchronized[this.type] {
- super.--=(that)
- }
-
- override def --=(it: Iterator[A]): this.type = synchronized[this.type] {
- super.--=(it)
+ override def --=(xs: TraversableOnce[A]): this.type = synchronized[this.type] {
+ super.--=(xs)
}
override def update(elem: A, included: Boolean): Unit = synchronized {
@@ -103,7 +95,7 @@ trait SynchronizedSet[A] extends Set[A] {
super.<<(cmd)
}
- override def clone(): Set[A] = synchronized {
+ override def clone(): Self = synchronized {
super.clone()
}
}
diff --git a/src/library/scala/collection/mutable/SynchronizedStack.scala b/src/library/scala/collection/mutable/SynchronizedStack.scala
index ff2f986244..4394d307eb 100644
--- a/src/library/scala/collection/mutable/SynchronizedStack.scala
+++ b/src/library/scala/collection/mutable/SynchronizedStack.scala
@@ -44,21 +44,13 @@ class SynchronizedStack[A] extends Stack[A] {
*/
override def push(elem1: A, elem2: A, elems: A*): this.type = synchronized[this.type] { super.push(elem1, elem2, elems: _*) }
- /** Pushes all elements provided by an <code>Traversable</code> object
- * on top of the stack. The elements are pushed in the order they
- * are given out by the iterator.
- *
- * @param iter an iterable object
- */
- override def pushAll(elems: scala.collection.Traversable[A]): this.type = synchronized[this.type] { super.pushAll(elems) }
-
/** Pushes all elements provided by an iterator
* on top of the stack. The elements are pushed in the order they
* are given out by the iterator.
*
* @param elems an iterator
*/
- override def pushAll(elems: Iterator[A]): this.type = synchronized[this.type] { super.pushAll(elems) }
+ override def pushAll(xs: TraversableOnce[A]): this.type = synchronized[this.type] { super.pushAll(xs) }
/** Returns the top element of the stack. This method will not remove
* the element from the stack. An error is signaled if there is no
diff --git a/src/library/scala/collection/mutable/WeakHashMap.scala b/src/library/scala/collection/mutable/WeakHashMap.scala
index 81c91dec3d..cad4dc2e43 100644
--- a/src/library/scala/collection/mutable/WeakHashMap.scala
+++ b/src/library/scala/collection/mutable/WeakHashMap.scala
@@ -13,10 +13,19 @@ package scala.collection
package mutable
import JavaConversions._
+import generic._
+
/**
* @since 2.8
*/
-class WeakHashMap[A, B] extends JMapWrapper[A, B](new java.util.WeakHashMap) {
+class WeakHashMap[A, B] extends JMapWrapper[A, B](new java.util.WeakHashMap)
+ with JMapWrapperLike[A, B, WeakHashMap[A, B]] {
override def empty = new WeakHashMap[A, B]
}
+
+object WeakHashMap extends MutableMapFactory[WeakHashMap] {
+ implicit def canBuildFrom[A, B]: CanBuildFrom[Coll, (A, B), WeakHashMap[A, B]] = new MapCanBuildFrom[A, B]
+ def empty[A, B]: WeakHashMap[A, B] = new WeakHashMap[A, B]
+}
+
diff --git a/src/library/scala/collection/mutable/WrappedArray.scala b/src/library/scala/collection/mutable/WrappedArray.scala
index 6652f5e40a..10117a1086 100644
--- a/src/library/scala/collection/mutable/WrappedArray.scala
+++ b/src/library/scala/collection/mutable/WrappedArray.scala
@@ -41,6 +41,13 @@ abstract class WrappedArray[T] extends IndexedSeq[T] with ArrayLike[T, WrappedAr
/** The underlying array */
def array: Array[T]
+
+ override def toArray[U >: T : ClassManifest]: Array[U] =
+ if (implicitly[ClassManifest[U]].erasure eq array.getClass.getComponentType)
+ array.asInstanceOf[Array[U]]
+ else
+ super.toArray[U]
+
override def stringPrefix = "WrappedArray"
/** Clones this object, including the underlying Array. */
diff --git a/src/library/scala/collection/readme-if-you-want-to-add-something.txt b/src/library/scala/collection/readme-if-you-want-to-add-something.txt
new file mode 100755
index 0000000000..6700cb7b68
--- /dev/null
+++ b/src/library/scala/collection/readme-if-you-want-to-add-something.txt
@@ -0,0 +1,50 @@
+Conventions for Collection Implementors
+
+Martin Odersky
+19 Mar 2010
+
+This note describes some conventions which must be followed to keep
+the collection libraries consistent.
+
+We distinguish in the following between two kinds of methods:
+
+ - ``Accessors'' access some of the elements of a collection, but return a result which
+ is unrelated to the collection.
+ Examples of accessors are: head, foldLeft, indexWhere, toSeq.
+
+ - ``Transformers'' access elements of a collection and produce a new collection of related
+ type as a result. The relation might either be direct (same type as receiver)
+ or indirect, linked by a CanBuildFrom implicit.
+ Examples of transformers are: filter, map, groupBy, zip.
+
+1. Proxies
+
+Every collection type has a Proxy class that forwards all operations to
+an underlying collection. Proxy methods are all implemented in classes
+with names ending in `ProxyLike'. If you add a new method to a collection
+class you need to add the same method to the corresponding ProxyLike class.
+
+2. Forwarders
+
+Classes Traversable, Iterable, and Seq also have forwarders, which
+forward all collection-specific accessor operations to an underlying
+collection. These are defined as classes with names ending
+in `Forwarder' in package collection.generic. If you add a new
+accessor method to a Seq or one of its collection superclasses, you
+need to add the same method to the corresponding forwarder class.
+
+3. Views
+
+Classes Traversable, Iterable, Seq, IndexedSeq, and mutable.IndexedSeq
+support views. Their operations are all defined in classes with names
+ending in `ViewLike'. If you add a new transformer method to one of
+the above collection classes, you need to add the same method to the
+corresponding view class. Failure to do so will cause the
+corresponding method to fail at runtime with an exception like
+UnsupportedOperationException("coll.newBuilder"). If there is no good
+way to implement the operation in question lazily, there's a fallback
+using the newForced method. See the definition of sorted in trait
+SeqViewLike as an example.
+
+
+
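The accessor/transformer distinction above can be illustrated with two standard List operations (plain library calls, nothing specific to this note):

```scala
val xs = List(1, 2, 3, 4)

// Transformer: accesses elements and produces a new collection of a
// related type (here: the same type as the receiver).
val evens: List[Int] = xs.filter(_ % 2 == 0)

// Accessor: accesses elements but returns a result unrelated to the
// collection type.
val sum: Int = xs.foldLeft(0)(_ + _)

assert(evens == List(2, 4))
assert(sum == 10)
```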
diff --git a/src/library/scala/compat/Platform.scala b/src/library/scala/compat/Platform.scala
index f7f5070699..7580d2cc0e 100644
--- a/src/library/scala/compat/Platform.scala
+++ b/src/library/scala/compat/Platform.scala
@@ -47,7 +47,7 @@ object Platform {
@inline
def getClassForName(name: String): Class[_] = java.lang.Class.forName(name)
- val EOL = System.getProperty("line.separator", "\n")
+ val EOL = util.Properties.lineSeparator
@inline
def currentTime: Long = System.currentTimeMillis()
diff --git a/src/library/scala/concurrent/DelayedLazyVal.scala b/src/library/scala/concurrent/DelayedLazyVal.scala
index 092800cb10..7c5d43e70c 100644
--- a/src/library/scala/concurrent/DelayedLazyVal.scala
+++ b/src/library/scala/concurrent/DelayedLazyVal.scala
@@ -27,9 +27,15 @@ import ops.future
* @version 2.8
*/
class DelayedLazyVal[T](f: () => T, body: => Unit) {
- @volatile private[this] var isDone = false
+ @volatile private[this] var _isDone = false
private[this] lazy val complete = f()
+ /** Whether the computation is complete.
+ *
+ * @return true if the computation is complete.
+ */
+ def isDone = _isDone
+
/** The current result of f(), or the final result if complete.
*
* @return the current value
@@ -38,6 +44,6 @@ class DelayedLazyVal[T](f: () => T, body: => Unit) {
future {
body
- isDone = true
+ _isDone = true
}
}
diff --git a/src/library/scala/math/BigDecimal.scala b/src/library/scala/math/BigDecimal.scala
index 6bd6b33484..bb6965fcdc 100644
--- a/src/library/scala/math/BigDecimal.scala
+++ b/src/library/scala/math/BigDecimal.scala
@@ -29,6 +29,8 @@ object BigDecimal
private val minCached = -512
private val maxCached = 512
+
+ /** Cache only for defaultMathContext using BigDecimals in a small range. */
private lazy val cache = new Array[BigDecimal](maxCached - minCached + 1)
val defaultMathContext = MathContext.UNLIMITED
@@ -50,12 +52,13 @@ object BigDecimal
*/
def apply(i: Int): BigDecimal = apply(i, defaultMathContext)
def apply(i: Int, mc: MathContext): BigDecimal =
- if (minCached <= i && i <= maxCached) {
+ if (mc == defaultMathContext && minCached <= i && i <= maxCached) {
val offset = i - minCached
var n = cache(offset)
if (n eq null) { n = new BigDecimal(BigDec.valueOf(i), mc); cache(offset) = n }
n
- } else new BigDecimal(BigDec.valueOf(i), mc)
+ }
+ else new BigDecimal(BigDec.valueOf(i), mc)
/** Constructs a <code>BigDecimal</code> whose value is equal to that of the
* specified long value.
diff --git a/src/library/scala/math/BigInt.scala b/src/library/scala/math/BigInt.scala
index 4c9f970cb4..5267ad8b95 100644
--- a/src/library/scala/math/BigInt.scala
+++ b/src/library/scala/math/BigInt.scala
@@ -102,7 +102,7 @@ object BigInt {
*/
implicit def int2bigInt(i: Int): BigInt = apply(i)
- /** Implicit copnversion from long to BigInt
+ /** Implicit conversion from long to BigInt
*/
implicit def long2bigInt(l: Long): BigInt = apply(l)
}
diff --git a/src/library/scala/math/Numeric.scala b/src/library/scala/math/Numeric.scala
index fc8e7c307d..65f213a08e 100644
--- a/src/library/scala/math/Numeric.scala
+++ b/src/library/scala/math/Numeric.scala
@@ -75,6 +75,21 @@ object Numeric {
}
implicit object ByteIsIntegral extends ByteIsIntegral with Ordering.ByteOrdering
+ trait CharIsIntegral extends Integral[Char] {
+ def plus(x: Char, y: Char): Char = (x + y).toChar
+ def minus(x: Char, y: Char): Char = (x - y).toChar
+ def times(x: Char, y: Char): Char = (x * y).toChar
+ def quot(x: Char, y: Char): Char = (x / y).toChar
+ def rem(x: Char, y: Char): Char = (x % y).toChar
+ def negate(x: Char): Char = (-x).toChar
+ def fromInt(x: Int): Char = x.toChar
+ def toInt(x: Char): Int = x.toInt
+ def toLong(x: Char): Long = x.toLong
+ def toFloat(x: Char): Float = x.toFloat
+ def toDouble(x: Char): Double = x.toDouble
+ }
+ implicit object CharIsIntegral extends CharIsIntegral with Ordering.CharOrdering
+
trait LongIsIntegral extends Integral[Long] {
def plus(x: Long, y: Long): Long = x + y
def minus(x: Long, y: Long): Long = x - y
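With `CharIsIntegral` in implicit scope, Char values participate in the generic Integral operations. A small usage sketch, assuming the implicit resolves as defined above:

```scala
val charIntegral = implicitly[Integral[Char]]

// Arithmetic is done in Int and truncated back to Char, per the trait above.
assert(charIntegral.plus('a', 1.toChar) == 'b')   // 97 + 1 == 98 == 'b'
assert(charIntegral.toInt('A') == 65)
```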
diff --git a/src/library/scala/math/Ordering.scala b/src/library/scala/math/Ordering.scala
index 1660cdb99e..04c2d96aba 100644
--- a/src/library/scala/math/Ordering.scala
+++ b/src/library/scala/math/Ordering.scala
@@ -133,6 +133,8 @@ object Ordering extends LowPriorityOrderingImplicits {
override def lteq(x: T, y: T): Boolean = !cmp(y, x)
}
+ def by[T, S: Ordering](f: T => S): Ordering[T] = fromLessThan((x, y) => implicitly[Ordering[S]].lt(f(x), f(y)))
+
trait UnitOrdering extends Ordering[Unit] {
def compare(x: Unit, y: Unit) = 0
}
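The new `Ordering.by` combinator derives an `Ordering[T]` from a key function and the implicit ordering on the key type. For example (`Person` is a hypothetical case class used only for illustration):

```scala
case class Person(name: String, age: Int)

// Order people by the Int ordering of their age.
val byAge: Ordering[Person] = Ordering.by[Person, Int](_.age)
val people = List(Person("b", 30), Person("a", 20), Person("c", 25))

assert(people.sorted(byAge).map(_.age) == List(20, 25, 30))
```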
diff --git a/src/library/scala/package.scala b/src/library/scala/package.scala
index 9fa09e3b72..9f31623bdf 100644
--- a/src/library/scala/package.scala
+++ b/src/library/scala/package.scala
@@ -26,6 +26,8 @@ package object scala {
type NumberFormatException = java.lang.NumberFormatException
type AbstractMethodError = java.lang.AbstractMethodError
+ type TraversableOnce[+A] = scala.collection.TraversableOnce[A]
+
type Traversable[+A] = scala.collection.Traversable[A]
val Traversable = scala.collection.Traversable
diff --git a/src/library/scala/reflect/ClassManifest.scala b/src/library/scala/reflect/ClassManifest.scala
index 58f3c89499..ded013a4b5 100644
--- a/src/library/scala/reflect/ClassManifest.scala
+++ b/src/library/scala/reflect/ClassManifest.scala
@@ -27,7 +27,7 @@ import scala.collection.mutable.{WrappedArray, ArrayBuilder}
* </p>
*/
@serializable
-trait ClassManifest[T] extends OptManifest[T] {
+trait ClassManifest[T] extends OptManifest[T] with Equals {
/** A class representing the type U to which T would be erased. Note
* that there is no subtyping relationship between T and U. */
@@ -73,15 +73,20 @@ trait ClassManifest[T] extends OptManifest[T] {
def >:>(that: ClassManifest[_]): Boolean =
that <:< this
+ def canEqual(other: Any) = other match {
+ case _: ClassManifest[_] => true
+ case _ => false
+ }
+
/** Tests whether the type represented by this manifest is equal to the
* type represented by `that' manifest. BE AWARE: the current
* implementation is an approximation, as the test is done on the
* erasure of the type. */
override def equals(that: Any): Boolean = that match {
- case _: AnyValManifest[_] => false
- case m: ClassManifest[_] => this.erasure == m.erasure
+ case m: ClassManifest[_] if m canEqual this => this.erasure == m.erasure
case _ => false
}
+ override def hashCode = this.erasure.hashCode
protected def arrayClass[T](tp: Predef.Class[_]): Predef.Class[Array[T]] =
java.lang.reflect.Array.newInstance(tp, 0).getClass.asInstanceOf[Predef.Class[Array[T]]]
@@ -225,11 +230,4 @@ object ClassManifest {
override val typeArguments = args.toList
override def toString = prefix.toString+"#"+name+argString
}
-
- /** ClassManifest for the intersection type `parents_0 with ... with parents_n'. */
- def intersectionType[T](parents: ClassManifest[_]*): ClassManifest[T] =
- new (ClassManifest[T] @serializable) {
- def erasure = parents.head.erasure
- override def toString = parents.mkString(" with ")
- }
}
diff --git a/src/library/scala/reflect/Code.scala b/src/library/scala/reflect/Code.scala
index 71e148db81..61138f2495 100644
--- a/src/library/scala/reflect/Code.scala
+++ b/src/library/scala/reflect/Code.scala
@@ -12,7 +12,7 @@
package scala.reflect
/** This type is required by the compiler and <b>should not be used in client code</b>. */
-class Code[Type](val tree: Tree)
+class Code[T](val tree: Tree)
/** This type is required by the compiler and <b>should not be used in client code</b>. */
object Code {
diff --git a/src/library/scala/reflect/Manifest.scala b/src/library/scala/reflect/Manifest.scala
index 69842e1193..b7cb86e1bd 100644
--- a/src/library/scala/reflect/Manifest.scala
+++ b/src/library/scala/reflect/Manifest.scala
@@ -27,18 +27,33 @@ import scala.collection.immutable.{List, Nil}
* </p>
*/
@serializable
-trait Manifest[T] extends ClassManifest[T] {
+trait Manifest[T] extends ClassManifest[T] with Equals {
override def typeArguments: List[Manifest[_]] = List()
override def arrayManifest: Manifest[Array[T]] =
Manifest.classType[Array[T]](arrayClass[T](erasure))
+
+ override def canEqual(that: Any): Boolean = that match {
+ case _: Manifest[_] => true
+ case _ => false
+ }
+ override def equals(that: Any): Boolean = that match {
+ case m: Manifest[_] if m canEqual this => (this <:< m) && (m <:< this)
+ case _ => false
+ }
+ override def hashCode = this.erasure.hashCode
}
@serializable
-trait AnyValManifest[T] extends Manifest[T] {
+trait AnyValManifest[T] extends Manifest[T] with Equals {
import Manifest.{ Any, AnyVal }
override def <:<(that: ClassManifest[_]): Boolean = (that eq this) || (that eq Any) || (that eq AnyVal)
+ override def canEqual(other: Any) = other match {
+ case _: AnyValManifest[_] => true
+ case _ => false
+ }
override def equals(that: Any): Boolean = this eq that.asInstanceOf[AnyRef]
+ override def hashCode = System.identityHashCode(this)
}
/** <ps>
@@ -137,6 +152,7 @@ object Manifest {
override def toString = "Any"
override def <:<(that: ClassManifest[_]): Boolean = (that eq this)
override def equals(that: Any): Boolean = this eq that.asInstanceOf[AnyRef]
+ override def hashCode = System.identityHashCode(this)
private def readResolve(): Any = Manifest.Any
}
@@ -144,6 +160,7 @@ object Manifest {
override def toString = "Object"
override def <:<(that: ClassManifest[_]): Boolean = (that eq this) || (that eq Any)
override def equals(that: Any): Boolean = this eq that.asInstanceOf[AnyRef]
+ override def hashCode = System.identityHashCode(this)
private def readResolve(): Any = Manifest.Object
}
@@ -151,6 +168,7 @@ object Manifest {
override def toString = "AnyVal"
override def <:<(that: ClassManifest[_]): Boolean = (that eq this) || (that eq Any)
override def equals(that: Any): Boolean = this eq that.asInstanceOf[AnyRef]
+ override def hashCode = System.identityHashCode(this)
private def readResolve(): Any = Manifest.AnyVal
}
@@ -159,6 +177,7 @@ object Manifest {
override def <:<(that: ClassManifest[_]): Boolean =
(that ne null) && (that ne Nothing) && !(that <:< AnyVal)
override def equals(that: Any): Boolean = this eq that.asInstanceOf[AnyRef]
+ override def hashCode = System.identityHashCode(this)
private def readResolve(): Any = Manifest.Null
}
@@ -166,6 +185,7 @@ object Manifest {
override def toString = "Nothing"
override def <:<(that: ClassManifest[_]): Boolean = (that ne null)
override def equals(that: Any): Boolean = this eq that.asInstanceOf[AnyRef]
+ override def hashCode = System.identityHashCode(this)
private def readResolve(): Any = Manifest.Nothing
}
diff --git a/src/library/scala/util/NameTransformer.scala b/src/library/scala/reflect/NameTransformer.scala
index 83451240d5..0629c3a2f8 100644..100755
--- a/src/library/scala/util/NameTransformer.scala
+++ b/src/library/scala/reflect/NameTransformer.scala
@@ -6,10 +6,10 @@
** |/ **
\* */
-// $Id$
+// $Id: NameTransformer.scala 20028 2009-12-07 11:49:19Z cunei $
-package scala.util
+package scala.reflect
/**
* @author Martin Odersky
diff --git a/src/library/scala/reflect/ScalaSignature.java b/src/library/scala/reflect/ScalaSignature.java
new file mode 100644
index 0000000000..d1cdbc0589
--- /dev/null
+++ b/src/library/scala/reflect/ScalaSignature.java
@@ -0,0 +1,13 @@
+package scala.reflect;
+
+import java.lang.annotation.ElementType;
+import java.lang.annotation.Retention;
+import java.lang.annotation.RetentionPolicy;
+import java.lang.annotation.Target;
+
+/** */
+@Retention(RetentionPolicy.RUNTIME)
+@Target(ElementType.TYPE)
+public @interface ScalaSignature {
+ public String bytes();
+}
diff --git a/src/library/scala/reflect/generic/AnnotationInfos.scala b/src/library/scala/reflect/generic/AnnotationInfos.scala
new file mode 100755
index 0000000000..cc6c909a45
--- /dev/null
+++ b/src/library/scala/reflect/generic/AnnotationInfos.scala
@@ -0,0 +1,50 @@
+package scala.reflect
+package generic
+
+trait AnnotationInfos { self: Universe =>
+
+ type AnnotationInfo <: AnyRef
+ val AnnotationInfo: AnnotationInfoExtractor
+
+ abstract class AnnotationInfoExtractor {
+ def apply(atp: Type, args: List[Tree], assocs: List[(Name, ClassfileAnnotArg)]): AnnotationInfo
+ def unapply(info: AnnotationInfo): Option[(Type, List[Tree], List[(Name, ClassfileAnnotArg)])]
+ }
+
+ type ClassfileAnnotArg <: AnyRef
+ implicit def classfileAnnotArgManifest: ClassManifest[ClassfileAnnotArg] // need a precise manifest to pass to UnPickle's toArray call
+
+ type LiteralAnnotArg <: ClassfileAnnotArg
+ val LiteralAnnotArg: LiteralAnnotArgExtractor
+
+ type ArrayAnnotArg <: ClassfileAnnotArg
+ val ArrayAnnotArg: ArrayAnnotArgExtractor
+
+ type ScalaSigBytes <: ClassfileAnnotArg
+ val ScalaSigBytes: ScalaSigBytesExtractor
+
+ type NestedAnnotArg <: ClassfileAnnotArg
+ val NestedAnnotArg: NestedAnnotArgExtractor
+
+ abstract class LiteralAnnotArgExtractor {
+ def apply(const: Constant): LiteralAnnotArg
+ def unapply(arg: LiteralAnnotArg): Option[Constant]
+ }
+
+ abstract class ArrayAnnotArgExtractor {
+ def apply(const: Array[ClassfileAnnotArg]): ArrayAnnotArg
+ def unapply(arg: ArrayAnnotArg): Option[Array[ClassfileAnnotArg]]
+ }
+
+ abstract class ScalaSigBytesExtractor {
+ def apply(bytes: Array[Byte]): ScalaSigBytes
+ def unapply(arg: ScalaSigBytes): Option[Array[Byte]]
+ }
+
+ abstract class NestedAnnotArgExtractor {
+ def apply(anninfo: AnnotationInfo): NestedAnnotArg
+ def unapply(arg: NestedAnnotArg): Option[AnnotationInfo]
+ }
+}
+
+
diff --git a/src/library/scala/reflect/generic/ByteCodecs.scala b/src/library/scala/reflect/generic/ByteCodecs.scala
new file mode 100644
index 0000000000..fd2e326e19
--- /dev/null
+++ b/src/library/scala/reflect/generic/ByteCodecs.scala
@@ -0,0 +1,209 @@
+/* __ *\
+** ________ ___ / / ___ Scala API **
+** / __/ __// _ | / / / _ | (c) 2007-2010, LAMP/EPFL **
+** __\ \/ /__/ __ |/ /__/ __ | http://scala-lang.org/ **
+** /____/\___/_/ |_/____/_/ | | **
+** |/ **
+\* */
+package scala.reflect.generic
+
+object ByteCodecs {
+
+ def avoidZero(src: Array[Byte]): Array[Byte] = {
+ var i = 0
+ val srclen = src.length
+ var count = 0
+ while (i < srclen) {
+ if (src(i) == 0x7f) count += 1
+ i += 1
+ }
+ val dst = new Array[Byte](srclen + count)
+ i = 0
+ var j = 0
+ while (i < srclen) {
+ val in = src(i)
+ if (in == 0x7f) {
+ dst(j) = (0xc0).toByte
+ dst(j + 1) = (0x80).toByte
+ j += 2
+ } else {
+ dst(j) = (in + 1).toByte
+ j += 1
+ }
+ i += 1
+ }
+ dst
+ }
+
+ def regenerateZero(src: Array[Byte]): Int = {
+ var i = 0
+ val srclen = src.length
+ var j = 0
+ while (i < srclen) {
+ val in: Int = src(i) & 0xff
+ if (in == 0xc0 && (src(i + 1) & 0xff) == 0x80) {
+ src(j) = 0x7f
+ i += 2
+ } else {
+ src(j) = (in - 1).toByte
+ i += 1
+ }
+ j += 1
+ }
+ j
+ }
+
+ def encode8to7(src: Array[Byte]): Array[Byte] = {
+ val srclen = src.length
+ val dstlen = (srclen * 8 + 6) / 7
+ val dst = new Array[Byte](dstlen)
+ var i = 0
+ var j = 0
+ while (i + 6 < srclen) {
+ var in: Int = src(i) & 0xff
+ dst(j) = (in & 0x7f).toByte
+ var out: Int = in >>> 7
+ in = src(i + 1) & 0xff
+ dst(j + 1) = (out | (in << 1) & 0x7f).toByte
+ out = in >>> 6
+ in = src(i + 2) & 0xff
+ dst(j + 2) = (out | (in << 2) & 0x7f).toByte
+ out = in >>> 5
+ in = src(i + 3) & 0xff
+ dst(j + 3) = (out | (in << 3) & 0x7f).toByte
+ out = in >>> 4
+ in = src(i + 4) & 0xff
+ dst(j + 4) = (out | (in << 4) & 0x7f).toByte
+ out = in >>> 3
+ in = src(i + 5) & 0xff
+ dst(j + 5) = (out | (in << 5) & 0x7f).toByte
+ out = in >>> 2
+ in = src(i + 6) & 0xff
+ dst(j + 6) = (out | (in << 6) & 0x7f).toByte
+ out = in >>> 1
+ dst(j + 7) = out.toByte
+ i += 7
+ j += 8
+ }
+ if (i < srclen) {
+ var in: Int = src(i) & 0xff
+ dst(j) = (in & 0x7f).toByte; j += 1
+ var out: Int = in >>> 7
+ if (i + 1 < srclen) {
+ in = src(i + 1) & 0xff
+ dst(j) = (out | (in << 1) & 0x7f).toByte; j += 1
+ out = in >>> 6
+ if (i + 2 < srclen) {
+ in = src(i + 2) & 0xff
+ dst(j) = (out | (in << 2) & 0x7f).toByte; j += 1
+ out = in >>> 5
+ if (i + 3 < srclen) {
+ in = src(i + 3) & 0xff
+ dst(j) = (out | (in << 3) & 0x7f).toByte; j += 1
+ out = in >>> 4
+ if (i + 4 < srclen) {
+ in = src(i + 4) & 0xff
+ dst(j) = (out | (in << 4) & 0x7f).toByte; j += 1
+ out = in >>> 3
+ if (i + 5 < srclen) {
+ in = src(i + 5) & 0xff
+ dst(j) = (out | (in << 5) & 0x7f).toByte; j += 1
+ out = in >>> 2
+ }
+ }
+ }
+ }
+ }
+ if (j < dstlen) dst(j) = out.toByte
+ }
+ dst
+ }
+
+ @deprecated("use 2-argument version instead")
+ def decode7to8(src: Array[Byte], srclen: Int, dstlen: Int) { decode7to8(src, srclen) }
+
+ def decode7to8(src: Array[Byte], srclen: Int): Int = {
+ var i = 0
+ var j = 0
+ val dstlen = (srclen * 7 + 7) / 8
+ while (i + 7 < srclen) {
+ var out: Int = src(i)
+ var in: Byte = src(i + 1)
+ src(j) = (out | (in & 0x01) << 7).toByte
+ out = in >>> 1
+ in = src(i + 2)
+ src(j + 1) = (out | (in & 0x03) << 6).toByte
+ out = in >>> 2
+ in = src(i + 3)
+ src(j + 2) = (out | (in & 0x07) << 5).toByte
+ out = in >>> 3
+ in = src(i + 4)
+ src(j + 3) = (out | (in & 0x0f) << 4).toByte
+ out = in >>> 4
+ in = src(i + 5)
+ src(j + 4) = (out | (in & 0x1f) << 3).toByte
+ out = in >>> 5
+ in = src(i + 6)
+ src(j + 5) = (out | (in & 0x3f) << 2).toByte
+ out = in >>> 6
+ in = src(i + 7)
+ src(j + 6) = (out | in << 1).toByte
+ i += 8
+ j += 7
+ }
+ if (i < srclen) {
+ var out: Int = src(i)
+ if (i + 1 < srclen) {
+ var in: Byte = src(i + 1)
+ src(j) = (out | (in & 0x01) << 7).toByte; j += 1
+ out = in >>> 1
+ if (i + 2 < srclen) {
+ in = src(i + 2)
+ src(j) = (out | (in & 0x03) << 6).toByte; j += 1
+ out = in >>> 2
+ if (i + 3 < srclen) {
+ in = src(i + 3)
+ src(j) = (out | (in & 0x07) << 5).toByte; j += 1
+ out = in >>> 3
+ if (i + 4 < srclen) {
+ in = src(i + 4)
+ src(j) = (out | (in & 0x0f) << 4).toByte; j += 1
+ out = in >>> 4
+ if (i + 5 < srclen) {
+ in = src(i + 5)
+ src(j) = (out | (in & 0x1f) << 3).toByte; j += 1
+ out = in >>> 5
+ if (i + 6 < srclen) {
+ in = src(i + 6)
+ src(j) = (out | (in & 0x3f) << 2).toByte; j += 1
+ out = in >>> 6
+ }
+ }
+ }
+ }
+ }
+ }
+ if (j < dstlen) src(j) = out.toByte
+ }
+ dstlen
+ }
+
+ def encode(xs: Array[Byte]): Array[Byte] = avoidZero(encode8to7(xs))
+
+ @deprecated("use 1-argument version instead")
+ def decode(xs: Array[Byte], dstlen: Int) { decode(xs) }
+
+ /** Destructively decodes array xs and returns the length of the decoded array. */
+ def decode(xs: Array[Byte]): Int = {
+ val len = regenerateZero(xs)
+ decode7to8(xs, len)
+ }
+}
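The packing arithmetic above is easier to follow as a bit stream: encode8to7 emits the input bits least-significant first, seven per output byte, and decode7to8 reverses this. A minimal standalone sketch under that reading (Pack7, pack and unpack are illustrative names, not part of the library):

```scala
// Hypothetical illustration of the 7-bit packing done by encode8to7/decode7to8.
// The bit stream is least-significant-bit first, matching the shifts above.
object Pack7 {
  // Pack 8-bit bytes into values holding 7 significant bits each.
  def pack(src: Array[Byte]): Array[Int] = {
    val bits = src.flatMap(b => (0 until 8).map(i => (b >> i) & 1))
    val pad  = Array.fill((7 - bits.length % 7) % 7)(0)
    (bits ++ pad).grouped(7)
      .map(g => g.zipWithIndex.map { case (bit, i) => bit << i }.sum)
      .toArray
  }
  // Unpack n original bytes back out of the 7-bit stream.
  def unpack(packed: Array[Int], n: Int): Array[Byte] = {
    val bits = packed.flatMap(b => (0 until 7).map(i => (b >> i) & 1))
    bits.grouped(8).take(n)
      .map(g => g.zipWithIndex.map { case (bit, i) => bit << i }.sum.toByte)
      .toArray
  }
}
```

The real code unrolls this loop and works in place, but the round-trip property is the same: unpacking what was packed restores the original bytes.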
+
+
+
+
+
+
+
+
diff --git a/src/library/scala/reflect/generic/Constants.scala b/src/library/scala/reflect/generic/Constants.scala
new file mode 100755
index 0000000000..2fe9d24980
--- /dev/null
+++ b/src/library/scala/reflect/generic/Constants.scala
@@ -0,0 +1,236 @@
+/* NSC -- new Scala compiler
+ * Copyright 2005-2010 LAMP/EPFL
+ * @author Martin Odersky
+ */
+// $Id: Constants.scala 20028 2009-12-07 11:49:19Z cunei $
+
+package scala.reflect
+package generic
+
+import java.lang.Integer.toOctalString
+import PickleFormat._
+
+trait Constants { self: Universe =>
+
+ import definitions._
+
+ final val NoTag = LITERAL - LITERAL
+ final val UnitTag = LITERALunit - LITERAL
+ final val BooleanTag = LITERALboolean - LITERAL
+ final val ByteTag = LITERALbyte - LITERAL
+ final val ShortTag = LITERALshort - LITERAL
+ final val CharTag = LITERALchar - LITERAL
+ final val IntTag = LITERALint - LITERAL
+ final val LongTag = LITERALlong - LITERAL
+ final val FloatTag = LITERALfloat - LITERAL
+ final val DoubleTag = LITERALdouble - LITERAL
+ final val StringTag = LITERALstring - LITERAL
+ final val NullTag = LITERALnull - LITERAL
+ final val ClassTag = LITERALclass - LITERAL
+ // For supporting java enumerations inside java annotations (see ClassfileParser)
+ final val EnumTag = LITERALenum - LITERAL
+
+ case class Constant(value: Any) {
+
+ val tag: Int =
+ if (value.isInstanceOf[Unit]) UnitTag
+ else if (value.isInstanceOf[Boolean]) BooleanTag
+ else if (value.isInstanceOf[Byte]) ByteTag
+ else if (value.isInstanceOf[Short]) ShortTag
+ else if (value.isInstanceOf[Char]) CharTag
+ else if (value.isInstanceOf[Int]) IntTag
+ else if (value.isInstanceOf[Long]) LongTag
+ else if (value.isInstanceOf[Float]) FloatTag
+ else if (value.isInstanceOf[Double]) DoubleTag
+ else if (value.isInstanceOf[String]) StringTag
+ else if (value.isInstanceOf[AbsType]) ClassTag
+ else if (value.isInstanceOf[AbsSymbol]) EnumTag
+ else if (value == null) NullTag
+ else throw new Error("bad constant value: " + value)
+
+ def isNumeric: Boolean = ByteTag <= tag && tag <= DoubleTag
+
+ def tpe: Type = tag match {
+ case UnitTag => UnitClass.tpe
+ case BooleanTag => BooleanClass.tpe
+ case ByteTag => ByteClass.tpe
+ case ShortTag => ShortClass.tpe
+ case CharTag => CharClass.tpe
+ case IntTag => IntClass.tpe
+ case LongTag => LongClass.tpe
+ case FloatTag => FloatClass.tpe
+ case DoubleTag => DoubleClass.tpe
+ case StringTag => StringClass.tpe
+ case NullTag => NullClass.tpe
+ case ClassTag => ClassType(value.asInstanceOf[Type])
+ case EnumTag =>
+ // given (in java): "class A { enum E { VAL1 } }"
+ // - symbolValue: the symbol of the actual enumeration value (VAL1)
+ // - .owner: the ModuleClassSymbol of the enumeration (object E)
+ // - .linkedClassOfClass: the ClassSymbol of the enumeration (class E)
+ symbolValue.owner.linkedClassOfClass.tpe
+ }
+
+ /** We need the equals method to take account of tags as well as values.
+ *
+ * @param other ...
+ * @return ...
+ */
+ override def equals(other: Any): Boolean = other match {
+ case that: Constant =>
+ this.tag == that.tag &&
+ (this.value == that.value || this.isNaN && that.isNaN)
+ case _ => false
+ }
+
+ def isNaN = value match {
+ case f: Float => f.isNaN
+ case d: Double => d.isNaN
+ case _ => false
+ }
+
+ def booleanValue: Boolean =
+ if (tag == BooleanTag) value.asInstanceOf[Boolean]
+ else throw new Error("value " + value + " is not a boolean");
+
+ def byteValue: Byte = tag match {
+ case ByteTag => value.asInstanceOf[Byte]
+ case ShortTag => value.asInstanceOf[Short].toByte
+ case CharTag => value.asInstanceOf[Char].toByte
+ case IntTag => value.asInstanceOf[Int].toByte
+ case LongTag => value.asInstanceOf[Long].toByte
+ case FloatTag => value.asInstanceOf[Float].toByte
+ case DoubleTag => value.asInstanceOf[Double].toByte
+ case _ => throw new Error("value " + value + " is not a Byte")
+ }
+
+ def shortValue: Short = tag match {
+ case ByteTag => value.asInstanceOf[Byte].toShort
+ case ShortTag => value.asInstanceOf[Short]
+ case CharTag => value.asInstanceOf[Char].toShort
+ case IntTag => value.asInstanceOf[Int].toShort
+ case LongTag => value.asInstanceOf[Long].toShort
+ case FloatTag => value.asInstanceOf[Float].toShort
+ case DoubleTag => value.asInstanceOf[Double].toShort
+ case _ => throw new Error("value " + value + " is not a Short")
+ }
+
+ def charValue: Char = tag match {
+ case ByteTag => value.asInstanceOf[Byte].toChar
+ case ShortTag => value.asInstanceOf[Short].toChar
+ case CharTag => value.asInstanceOf[Char]
+ case IntTag => value.asInstanceOf[Int].toChar
+ case LongTag => value.asInstanceOf[Long].toChar
+ case FloatTag => value.asInstanceOf[Float].toChar
+ case DoubleTag => value.asInstanceOf[Double].toChar
+ case _ => throw new Error("value " + value + " is not a Char")
+ }
+
+ def intValue: Int = tag match {
+ case ByteTag => value.asInstanceOf[Byte].toInt
+ case ShortTag => value.asInstanceOf[Short].toInt
+ case CharTag => value.asInstanceOf[Char].toInt
+ case IntTag => value.asInstanceOf[Int]
+ case LongTag => value.asInstanceOf[Long].toInt
+ case FloatTag => value.asInstanceOf[Float].toInt
+ case DoubleTag => value.asInstanceOf[Double].toInt
+ case _ => throw new Error("value " + value + " is not an Int")
+ }
+
+ def longValue: Long = tag match {
+ case ByteTag => value.asInstanceOf[Byte].toLong
+ case ShortTag => value.asInstanceOf[Short].toLong
+ case CharTag => value.asInstanceOf[Char].toLong
+ case IntTag => value.asInstanceOf[Int].toLong
+ case LongTag => value.asInstanceOf[Long]
+ case FloatTag => value.asInstanceOf[Float].toLong
+ case DoubleTag => value.asInstanceOf[Double].toLong
+ case _ => throw new Error("value " + value + " is not a Long")
+ }
+
+ def floatValue: Float = tag match {
+ case ByteTag => value.asInstanceOf[Byte].toFloat
+ case ShortTag => value.asInstanceOf[Short].toFloat
+ case CharTag => value.asInstanceOf[Char].toFloat
+ case IntTag => value.asInstanceOf[Int].toFloat
+ case LongTag => value.asInstanceOf[Long].toFloat
+ case FloatTag => value.asInstanceOf[Float]
+ case DoubleTag => value.asInstanceOf[Double].toFloat
+ case _ => throw new Error("value " + value + " is not a Float")
+ }
+
+ def doubleValue: Double = tag match {
+ case ByteTag => value.asInstanceOf[Byte].toDouble
+ case ShortTag => value.asInstanceOf[Short].toDouble
+ case CharTag => value.asInstanceOf[Char].toDouble
+ case IntTag => value.asInstanceOf[Int].toDouble
+ case LongTag => value.asInstanceOf[Long].toDouble
+ case FloatTag => value.asInstanceOf[Float].toDouble
+ case DoubleTag => value.asInstanceOf[Double]
+ case _ => throw new Error("value " + value + " is not a Double")
+ }
+
+ /** Convert constant value to conform to given type.
+ *
+ * @param pt ...
+ * @return ...
+ */
+ def convertTo(pt: Type): Constant = {
+ val target = pt.typeSymbol
+ if (target == tpe.typeSymbol)
+ this
+ else if (target == ByteClass && ByteTag <= tag && tag <= IntTag &&
+ -128 <= intValue && intValue <= 127)
+ Constant(byteValue)
+ else if (target == ShortClass && ByteTag <= tag && tag <= IntTag &&
+ -32768 <= intValue && intValue <= 32767)
+ Constant(shortValue)
+ else if (target == CharClass && ByteTag <= tag && tag <= IntTag &&
+ 0 <= intValue && intValue <= 65535)
+ Constant(charValue)
+ else if (target == IntClass && ByteTag <= tag && tag <= IntTag)
+ Constant(intValue)
+ else if (target == LongClass && ByteTag <= tag && tag <= LongTag)
+ Constant(longValue)
+ else if (target == FloatClass && ByteTag <= tag && tag <= FloatTag)
+ Constant(floatValue)
+ else if (target == DoubleClass && ByteTag <= tag && tag <= DoubleTag)
+ Constant(doubleValue)
+ else {
+ null
+ }
+ }
+
+ def stringValue: String =
+ if (value == null) "null"
+ else if (tag == ClassTag) signature(typeValue)
+ else value.toString()
+
+ def escapedStringValue: String = {
+ def escape(text: String): String = {
+ val buf = new StringBuilder
+ for (c <- text.iterator)
+ if (c.isControl)
+ buf.append("\\0" + toOctalString(c.asInstanceOf[Int]))
+ else
+ buf.append(c)
+ buf.toString
+ }
+ tag match {
+ case NullTag => "null"
+ case StringTag => "\"" + escape(stringValue) + "\""
+ case ClassTag => "classOf[" + signature(typeValue) + "]"
+ case CharTag => escape("\'" + charValue + "\'")
+ case LongTag => longValue.toString() + "L"
+ case _ => value.toString()
+ }
+ }
+
+ def typeValue: Type = value.asInstanceOf[Type]
+
+ def symbolValue: Symbol = value.asInstanceOf[Symbol]
+
+ override def hashCode: Int =
+ if (value == null) 0 else value.hashCode() * 41 + 17
+ }
+}
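The custom equals on Constant matters because Scala's cooperative numeric equality compares boxed floating-point values with primitive ==, under which NaN is never equal to itself. A standalone sketch of the same guard (Const is a hypothetical stand-in for Constant, using runtime classes as a proxy for tags):

```scala
// Hypothetical stand-in for Constant, showing why the isNaN guard is needed.
final case class Const(value: Any) {
  private def isNaN = value match {
    case f: Float  => f.isNaN
    case d: Double => d.isNaN
    case _         => false
  }
  override def equals(other: Any): Boolean = other match {
    case that: Const =>
      // Like Constant: require matching runtime classes (standing in for tags),
      // then compare values, treating NaN as equal to itself.
      value.getClass == that.value.getClass &&
        (value == that.value || (this.isNaN && that.isNaN))
    case _ => false
  }
}
```

Without the isNaN branch, a Constant wrapping Double.NaN would not equal itself, breaking caching and pattern matching on constants.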
diff --git a/src/library/scala/reflect/generic/Flags.scala b/src/library/scala/reflect/generic/Flags.scala
new file mode 100755
index 0000000000..f0f1f14ade
--- /dev/null
+++ b/src/library/scala/reflect/generic/Flags.scala
@@ -0,0 +1,198 @@
+package scala.reflect
+package generic
+
+object Flags extends Flags
+
+class Flags {
+
+ // modifiers
+ final val IMPLICIT = 0x00000200
+ final val FINAL = 0x00000020
+ final val PRIVATE = 0x00000004
+ final val PROTECTED = 0x00000001
+
+ final val SEALED = 0x00000400
+ final val OVERRIDE = 0x00000002
+ final val CASE = 0x00000800
+ final val ABSTRACT = 0x00000008 // abstract class, or used in conjunction
+ // with abstract override.
+ // Note difference to DEFERRED!
+
+ final val DEFERRED = 0x00000010 // was `abstract' for members | trait is virtual
+ final val METHOD = 0x00000040 // a method
+ final val MODULE = 0x00000100 // symbol is module or class implementing a module
+ final val INTERFACE = 0x00000080 // symbol is an interface (i.e. a trait which defines only abstract methods)
+
+ final val MUTABLE = 0x00001000 // symbol is a mutable variable.
+ final val PARAM = 0x00002000 // symbol is a (value or type) parameter to a method
+ final val PACKAGE = 0x00004000 // symbol is a java package
+ // available: 0x00008000
+
+ final val COVARIANT = 0x00010000 // symbol is a covariant type variable
+ final val CAPTURED = 0x00010000 // variable is accessed from nested function.
+ // Set by LambdaLift
+ final val BYNAMEPARAM = 0x00010000 // parameter is by name
+ final val CONTRAVARIANT = 0x00020000 // symbol is a contravariant type variable
+ final val LABEL = 0x00020000 // method symbol is a label. Set by TailCall
+ final val INCONSTRUCTOR = 0x00020000 // class symbol is defined in this/superclass
+ // constructor.
+ final val ABSOVERRIDE = 0x00040000 // combination of abstract & override
+ final val LOCAL = 0x00080000 // symbol is local to current class (i.e. private[this] or protected[this])
+ // pre: PRIVATE or PROTECTED are also set
+ final val JAVA = 0x00100000 // symbol was defined by a Java class
+ final val SYNTHETIC = 0x00200000 // symbol is compiler-generated
+ final val STABLE = 0x00400000 // functions that are assumed to be stable
+ // (typically, access methods for valdefs)
+ // or classes that do not contain abstract types.
+ final val STATIC = 0x00800000 // static field, method or class
+
+ final val CASEACCESSOR = 0x01000000 // symbol is a case parameter (or its accessor)
+ final val TRAIT = 0x02000000 // symbol is a trait
+ final val DEFAULTPARAM = 0x02000000 // the parameter has a default value
+ final val BRIDGE = 0x04000000 // function is a bridge method. Set by Erasure
+ final val ACCESSOR = 0x08000000 // a value or variable accessor (getter or setter)
+
+ final val SUPERACCESSOR = 0x10000000 // a super accessor
+ final val PARAMACCESSOR = 0x20000000 // for value definitions: is an access method
+ // for a final val parameter
+ // for parameters: is a val parameter
+ final val MODULEVAR = 0x40000000 // for variables: is the variable caching a module value
+ final val SYNTHETICMETH = 0x40000000 // for methods: synthetic method, but without SYNTHETIC flag
+ final val MONOMORPHIC = 0x40000000 // for type symbols: does not have type parameters
+ final val LAZY = 0x80000000L // symbol is a lazy val. can't have MUTABLE unless transformed by typer
+
+ final val IS_ERROR = 0x100000000L // symbol is an error symbol
+ final val OVERLOADED = 0x200000000L // symbol is overloaded
+ final val LIFTED = 0x400000000L // class has been lifted out to package level
+ // local value has been lifted out to class level
+ // todo: make LIFTED = latePRIVATE?
+ final val MIXEDIN = 0x800000000L // term member has been mixed in
+ final val EXISTENTIAL = 0x800000000L // type is an existential parameter or skolem
+
+ final val EXPANDEDNAME = 0x1000000000L // name has been expanded with class suffix
+ final val IMPLCLASS = 0x2000000000L // symbol is an implementation class
+ final val PRESUPER = 0x2000000000L // value is evaluated before super call
+ final val TRANS_FLAG = 0x4000000000L // transient flag guaranteed to be reset
+ // after each phase.
+
+ final val LOCKED = 0x8000000000L // temporary flag to catch cyclic dependencies
+ final val SPECIALIZED = 0x10000000000L // symbol is a generated specialized member
+ final val DEFAULTINIT = 0x20000000000L // symbol is initialized to the default value (checked by -Xcheckinit)
+ final val VBRIDGE = 0x40000000000L // symbol is a varargs bridge
+
+ // pickling and unpickling of flags
+
+ // The flags from 0x001 to 0x800 are different in the raw flags
+ // and in the pickled format.
+
+ private final val IMPLICIT_PKL = 0x00000001
+ private final val FINAL_PKL = 0x00000002
+ private final val PRIVATE_PKL = 0x00000004
+ private final val PROTECTED_PKL = 0x00000008
+
+ private final val SEALED_PKL = 0x00000010
+ private final val OVERRIDE_PKL = 0x00000020
+ private final val CASE_PKL = 0x00000040
+ private final val ABSTRACT_PKL = 0x00000080
+
+ private final val DEFERRED_PKL = 0x00000100
+ private final val METHOD_PKL = 0x00000200
+ private final val MODULE_PKL = 0x00000400
+ private final val INTERFACE_PKL = 0x00000800
+
+ private final val PKL_MASK = 0x00000FFF
+
+ final val PickledFlags: Long = 0xFFFFFFFFL
+
+ private val r2p = {
+ def rawFlagsToPickledAux(flags:Int) = {
+ var pflags=0
+ if ((flags & IMPLICIT )!=0) pflags|=IMPLICIT_PKL
+ if ((flags & FINAL )!=0) pflags|=FINAL_PKL
+ if ((flags & PRIVATE )!=0) pflags|=PRIVATE_PKL
+ if ((flags & PROTECTED)!=0) pflags|=PROTECTED_PKL
+ if ((flags & SEALED )!=0) pflags|=SEALED_PKL
+ if ((flags & OVERRIDE )!=0) pflags|=OVERRIDE_PKL
+ if ((flags & CASE )!=0) pflags|=CASE_PKL
+ if ((flags & ABSTRACT )!=0) pflags|=ABSTRACT_PKL
+ if ((flags & DEFERRED )!=0) pflags|=DEFERRED_PKL
+ if ((flags & METHOD )!=0) pflags|=METHOD_PKL
+ if ((flags & MODULE )!=0) pflags|=MODULE_PKL
+ if ((flags & INTERFACE)!=0) pflags|=INTERFACE_PKL
+ pflags
+ }
+ val v=new Array[Int](PKL_MASK+1)
+ var i=0
+ while (i<=PKL_MASK) {
+ v(i)=rawFlagsToPickledAux(i)
+ i+=1
+ }
+ v
+ }
+
+ private val p2r = {
+ def pickledToRawFlagsAux(pflags:Int) = {
+ var flags=0
+ if ((pflags & IMPLICIT_PKL )!=0) flags|=IMPLICIT
+ if ((pflags & FINAL_PKL )!=0) flags|=FINAL
+ if ((pflags & PRIVATE_PKL )!=0) flags|=PRIVATE
+ if ((pflags & PROTECTED_PKL)!=0) flags|=PROTECTED
+ if ((pflags & SEALED_PKL )!=0) flags|=SEALED
+ if ((pflags & OVERRIDE_PKL )!=0) flags|=OVERRIDE
+ if ((pflags & CASE_PKL )!=0) flags|=CASE
+ if ((pflags & ABSTRACT_PKL )!=0) flags|=ABSTRACT
+ if ((pflags & DEFERRED_PKL )!=0) flags|=DEFERRED
+ if ((pflags & METHOD_PKL )!=0) flags|=METHOD
+ if ((pflags & MODULE_PKL )!=0) flags|=MODULE
+ if ((pflags & INTERFACE_PKL)!=0) flags|=INTERFACE
+ flags
+ }
+ val v=new Array[Int](PKL_MASK+1)
+ var i=0
+ while (i<=PKL_MASK) {
+ v(i)=pickledToRawFlagsAux(i)
+ i+=1
+ }
+ v
+ }
+
+ def rawFlagsToPickled(flags:Long):Long =
+ (flags & ~PKL_MASK) | r2p(flags.toInt & PKL_MASK)
+
+ def pickledToRawFlags(pflags:Long):Long =
+ (pflags & ~PKL_MASK) | p2r(pflags.toInt & PKL_MASK)
+
+ // List of the raw flags, in pickled order
+ protected val pickledListOrder = {
+ def findBit(m:Long):Int = {
+ var mask=m
+ var i=0
+ while (i <= 62) {
+ if ((mask&1) == 1L) return i
+ mask >>= 1
+ i += 1
+ }
+ throw new AssertionError()
+ }
+ val v=new Array[Long](63)
+ v(findBit(IMPLICIT_PKL ))=IMPLICIT
+ v(findBit(FINAL_PKL ))=FINAL
+ v(findBit(PRIVATE_PKL ))=PRIVATE
+ v(findBit(PROTECTED_PKL))=PROTECTED
+ v(findBit(SEALED_PKL ))=SEALED
+ v(findBit(OVERRIDE_PKL ))=OVERRIDE
+ v(findBit(CASE_PKL ))=CASE
+ v(findBit(ABSTRACT_PKL ))=ABSTRACT
+ v(findBit(DEFERRED_PKL ))=DEFERRED
+ v(findBit(METHOD_PKL ))=METHOD
+ v(findBit(MODULE_PKL ))=MODULE
+ v(findBit(INTERFACE_PKL))=INTERFACE
+ var i=findBit(PKL_MASK+1)
+ while (i <= 62) {
+ v(i)=1L << i
+ i += 1
+ }
+ v.toList
+ }
+
+}
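The r2p/p2r tables above trade 2 × 4096 precomputed ints for O(1) translation of the low twelve bits. The mapping itself can be written directly from (raw, pickled) bit pairs; a hypothetical sketch covering only the four access-modifier flags (FlagCodec and its members are illustrative names):

```scala
// Hypothetical pair-driven version of the raw <-> pickled flag translation.
// The real class covers twelve such pairs, precomputed into lookup tables.
object FlagCodec {
  // (raw bit, pickled bit) pairs, as in the *_PKL constants above.
  val pairs: List[(Long, Long)] = List(
    (0x00000200L, 0x001L), // IMPLICIT
    (0x00000020L, 0x002L), // FINAL
    (0x00000004L, 0x004L), // PRIVATE
    (0x00000001L, 0x008L)  // PROTECTED
  )
  val rawMask: Long     = pairs.map(_._1).reduce(_ | _)
  val pickledMask: Long = pairs.map(_._2).reduce(_ | _)

  // Bits outside the mask pass through unchanged, as with PKL_MASK above.
  def rawToPickled(flags: Long): Long =
    (flags & ~rawMask) |
      pairs.collect { case (r, p) if (flags & r) != 0 => p }.foldLeft(0L)(_ | _)

  def pickledToRaw(pflags: Long): Long =
    (pflags & ~pickledMask) |
      pairs.collect { case (r, p) if (pflags & p) != 0 => r }.foldLeft(0L)(_ | _)
}
```

For flag sets whose masked bits all appear in the pair list, the two directions are inverses, which is the invariant the precomputed tables rely on.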
diff --git a/src/library/scala/reflect/generic/Names.scala b/src/library/scala/reflect/generic/Names.scala
new file mode 100755
index 0000000000..1b31726e3a
--- /dev/null
+++ b/src/library/scala/reflect/generic/Names.scala
@@ -0,0 +1,21 @@
+package scala.reflect
+package generic
+
+trait Names {
+
+ type Name >: Null <: AnyRef
+
+ def newTermName(cs: Array[Char], offset: Int, len: Int): Name
+ def newTermName(cs: Array[Byte], offset: Int, len: Int): Name
+ def newTermName(s: String): Name
+
+ def mkTermName(name: Name): Name
+
+ def newTypeName(cs: Array[Char], offset: Int, len: Int): Name
+ def newTypeName(cs: Array[Byte], offset: Int, len: Int): Name
+ def newTypeName(s: String): Name
+
+ def mkTypeName(name: Name): Name
+}
+
+
diff --git a/src/library/scala/reflect/generic/PickleBuffer.scala b/src/library/scala/reflect/generic/PickleBuffer.scala
new file mode 100755
index 0000000000..2fab02bcfe
--- /dev/null
+++ b/src/library/scala/reflect/generic/PickleBuffer.scala
@@ -0,0 +1,188 @@
+/* NSC -- new Scala compiler
+ * Copyright 2005-2010 LAMP/EPFL
+ * @author Martin Odersky
+ */
+// $Id: PickleBuffer.scala 20028 2009-12-07 11:49:19Z cunei $
+
+package scala.reflect
+package generic
+
+/** Variable length byte arrays, with methods for basic pickling and unpickling.
+ *
+ * @param data The initial buffer
+ * @param from The first index where defined data are found
+ * @param to The first index where new data can be written
+ */
+class PickleBuffer(data: Array[Byte], from: Int, to: Int) {
+
+ var bytes = data
+ var readIndex = from
+ var writeIndex = to
+
+ /** Double bytes array */
+ private def dble() {
+ val bytes1 = new Array[Byte](bytes.length * 2)
+ Array.copy(bytes, 0, bytes1, 0, writeIndex)
+ bytes = bytes1
+ }
+
+ def ensureCapacity(capacity: Int) =
+ while (bytes.length < writeIndex + capacity) dble()
+
+ // -- Basic output routines --------------------------------------------
+
+ /** Write a byte of data */
+ def writeByte(b: Int) {
+ if (writeIndex == bytes.length) dble()
+ bytes(writeIndex) = b.toByte
+ writeIndex += 1
+ }
+
+ /** Write a natural number in big endian format, base 128.
+ * All but the last digits have bit 0x80 set.
+ */
+ def writeNat(x: Int) =
+ writeLongNat(x.toLong & 0x00000000FFFFFFFFL)
+
+ /**
+ * Like writeNat, but for longs. This is not the same as
+ * writeLong, which writes in base 256. Note that the
+ * binary representation of LongNat is identical to Nat
+ * if the long value is in the range Int.MIN_VALUE to
+ * Int.MAX_VALUE.
+ */
+ def writeLongNat(x: Long) {
+ def writeNatPrefix(x: Long) {
+ val y = x >>> 7
+ if (y != 0L) writeNatPrefix(y)
+ writeByte(((x & 0x7f) | 0x80).toInt)
+ }
+ val y = x >>> 7
+ if (y != 0L) writeNatPrefix(y)
+ writeByte((x & 0x7f).toInt)
+ }
+
+ /** Write a natural number <code>x</code> at position <code>pos</code>.
+ * If number is more than one byte, shift rest of array to make space.
+ *
+ * @param pos ...
+ * @param x ...
+ */
+ def patchNat(pos: Int, x: Int) {
+ def patchNatPrefix(x: Int) {
+ writeByte(0)
+ Array.copy(bytes, pos, bytes, pos+1, writeIndex - (pos+1))
+ bytes(pos) = ((x & 0x7f) | 0x80).toByte
+ val y = x >>> 7
+ if (y != 0) patchNatPrefix(y)
+ }
+ bytes(pos) = (x & 0x7f).toByte
+ val y = x >>> 7
+ if (y != 0) patchNatPrefix(y)
+ }
+
+ /** Write a long number <code>x</code> in signed big endian format, base 256.
+ *
+ * @param x The long number to be written.
+ */
+ def writeLong(x: Long) {
+ val y = x >> 8
+ val z = x & 0xff
+ if (-y != (z >> 7)) writeLong(y)
+ writeByte(z.toInt)
+ }
+
+ // -- Basic input routines --------------------------------------------
+
+ /** Peek at the current byte without moving the read index */
+ def peekByte(): Int = bytes(readIndex)
+
+ /** Read a byte */
+ def readByte(): Int = {
+ val x = bytes(readIndex); readIndex += 1; x
+ }
+
+ /** Read a natural number in big endian format, base 128.
+ * All but the last digits have bit 0x80 set.*/
+ def readNat(): Int = readLongNat().toInt
+
+ def readLongNat(): Long = {
+ var b = 0L
+ var x = 0L
+ do {
+ b = readByte()
+ x = (x << 7) + (b & 0x7f)
+ } while ((b & 0x80) != 0L);
+ x
+ }
+
+ /** Read a long number in signed big endian format, base 256. */
+ def readLong(len: Int): Long = {
+ var x = 0L
+ var i = 0
+ while (i < len) {
+ x = (x << 8) + (readByte() & 0xff)
+ i += 1
+ }
+ val leading = 64 - (len << 3)
+ x << leading >> leading
+ }
+
+ /** Returns the buffer as a sequence of (Int, Array[Byte]) representing
+ * (tag, data) of the individual entries. Saves and restores buffer state.
+ */
+
+ def toIndexedSeq: IndexedSeq[(Int, Array[Byte])] = {
+ val saved = readIndex
+ readIndex = 0
+ readNat() ; readNat() // discarding version
+ val result = new Array[(Int, Array[Byte])](readNat())
+
+ result.indices foreach { index =>
+ val tag = readNat()
+ val len = readNat()
+ val bytes = data.slice(readIndex, len + readIndex)
+ readIndex += len
+
+ result(index) = tag -> bytes
+ }
+
+ readIndex = saved
+ result.toIndexedSeq
+ }
+
+ /** Perform operation <code>op</code> until the condition
+ * <code>readIndex == end</code> is satisfied.
+ * Concatenate results into a list.
+ *
+ * @param end ...
+ * @param op ...
+ * @return ...
+ */
+ def until[T](end: Int, op: () => T): List[T] =
+ if (readIndex == end) List() else op() :: until(end, op);
+
+ /** Perform operation <code>op</code> the number of
+ * times specified. Concatenate the results into a list.
+ */
+ def times[T](n: Int, op: ()=>T): List[T] =
+ if (n == 0) List() else op() :: times(n-1, op)
+
+ /** Pickle = majorVersion_Nat minorVersion_Nat nbEntries_Nat {Entry}
+ * Entry = type_Nat length_Nat [actual entries]
+ *
+ * Assumes that the ..Version_Nat are already consumed.
+ *
+ * @return an array mapping entry numbers to locations in
+ * the byte array where the entries start.
+ */
+ def createIndex: Array[Int] = {
+ val index = new Array[Int](readNat()) // nbEntries_Nat
+ for (i <- 0 until index.length) {
+ index(i) = readIndex
+ readByte() // skip type_Nat
+ readIndex = readNat() + readIndex // read length_Nat, jump to next entry
+ }
+ index
+ }
+}
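The Nat format that writeNat/readNat implement above can be modeled on its own: base 128, big endian, with bit 0x80 set on every byte except the last. A hedged standalone sketch (Nat, encode and decode are illustrative names; the buffer class writes bytes in place instead of building lists):

```scala
// Hypothetical standalone model of the Nat encoding: base 128, big endian,
// continuation bit 0x80 on all bytes except the last.
object Nat {
  def encode(x: Long): List[Int] = {
    // Emit the high-order septets first, each with the continuation bit set.
    def prefix(y: Long): List[Int] =
      if (y == 0L) Nil
      else prefix(y >>> 7) :+ (((y & 0x7f) | 0x80).toInt)
    prefix(x >>> 7) :+ (x & 0x7f).toInt
  }
  // Fold the septets back together, ignoring the continuation bits.
  def decode(bs: List[Int]): Long =
    bs.foldLeft(0L)((acc, b) => (acc << 7) + (b & 0x7f))
}
```

Small values cost one byte and larger ones grow by a byte per seven bits, which is why the pickle format uses Nats for lengths and references.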
diff --git a/src/library/scala/reflect/generic/PickleFormat.scala b/src/library/scala/reflect/generic/PickleFormat.scala
new file mode 100755
index 0000000000..d1e884f513
--- /dev/null
+++ b/src/library/scala/reflect/generic/PickleFormat.scala
@@ -0,0 +1,223 @@
+package scala.reflect
+package generic
+
+/** This object provides constants for pickling attributes.
+ *
+ * If you extend the format, be sure to increase the
+ * version minor number.
+ *
+ * @author Martin Odersky
+ * @version 1.0
+ */
+object PickleFormat {
+
+/***************************************************
+ * Symbol table attribute format:
+ * Symtab = nentries_Nat {Entry}
+ * Entry = 1 TERMNAME len_Nat NameInfo
+ * | 2 TYPENAME len_Nat NameInfo
+ * | 3 NONEsym len_Nat
+ * | 4 TYPEsym len_Nat SymbolInfo
+ * | 5 ALIASsym len_Nat SymbolInfo
+ * | 6 CLASSsym len_Nat SymbolInfo [thistype_Ref]
+ * | 7 MODULEsym len_Nat SymbolInfo
+ * | 8 VALsym len_Nat [defaultGetter_Ref /* no longer needed */] SymbolInfo [alias_Ref]
+ * | 9 EXTref len_Nat name_Ref [owner_Ref]
+ * | 10 EXTMODCLASSref len_Nat name_Ref [owner_Ref]
+ * | 11 NOtpe len_Nat
+ * | 12 NOPREFIXtpe len_Nat
+ * | 13 THIStpe len_Nat sym_Ref
+ * | 14 SINGLEtpe len_Nat type_Ref sym_Ref
+ * | 15 CONSTANTtpe len_Nat constant_Ref
+ * | 16 TYPEREFtpe len_Nat type_Ref sym_Ref {targ_Ref}
+ * | 17 TYPEBOUNDStpe len_Nat tpe_Ref tpe_Ref
+ * | 18 REFINEDtpe len_Nat classsym_Ref {tpe_Ref}
+ * | 19 CLASSINFOtpe len_Nat classsym_Ref {tpe_Ref}
+ * | 20 METHODtpe len_Nat tpe_Ref {sym_Ref}
+ * | 21 POLYtpe len_Nat tpe_Ref {sym_Ref}
+ * | 22 IMPLICITMETHODtpe len_Nat tpe_Ref {sym_Ref} /* no longer needed */
+ * | 52 SUPERtpe len_Nat tpe_Ref tpe_Ref
+ * | 24 LITERALunit len_Nat
+ * | 25 LITERALboolean len_Nat value_Long
+ * | 26 LITERALbyte len_Nat value_Long
+ * | 27 LITERALshort len_Nat value_Long
+ * | 28 LITERALchar len_Nat value_Long
+ * | 29 LITERALint len_Nat value_Long
+ * | 30 LITERALlong len_Nat value_Long
+ * | 31 LITERALfloat len_Nat value_Long
+ * | 32 LITERALdouble len_Nat value_Long
+ * | 33 LITERALstring len_Nat name_Ref
+ * | 34 LITERALnull len_Nat
+ * | 35 LITERALclass len_Nat tpe_Ref
+ * | 36 LITERALenum len_Nat sym_Ref
+ * | 40 SYMANNOT len_Nat sym_Ref AnnotInfoBody
+ * | 41 CHILDREN len_Nat sym_Ref {sym_Ref}
+ * | 42 ANNOTATEDtpe len_Nat [sym_Ref /* no longer needed */] tpe_Ref {annotinfo_Ref}
+ * | 43 ANNOTINFO len_Nat AnnotInfoBody
+ * | 44 ANNOTARGARRAY len_Nat {constAnnotArg_Ref}
+ * | 47 DEBRUIJNINDEXtpe len_Nat level_Nat index_Nat
+ * | 48 EXISTENTIALtpe len_Nat type_Ref {symbol_Ref}
+ * | 49 TREE len_Nat 1 EMPTYtree
+ * | 49 TREE len_Nat 2 PACKAGEtree type_Ref sym_Ref mods_Ref name_Ref {tree_Ref}
+ * | 49 TREE len_Nat 3 CLASStree type_Ref sym_Ref mods_Ref name_Ref tree_Ref {tree_Ref}
+ * | 49 TREE len_Nat 4 MODULEtree type_Ref sym_Ref mods_Ref name_Ref tree_Ref
+ * | 49 TREE len_Nat 5 VALDEFtree type_Ref sym_Ref mods_Ref name_Ref tree_Ref tree_Ref
+ * | 49 TREE len_Nat 6 DEFDEFtree type_Ref sym_Ref mods_Ref name_Ref numtparams_Nat {tree_Ref} numparamss_Nat {numparams_Nat {tree_Ref}} tree_Ref tree_Ref
+ * | 49 TREE len_Nat 7 TYPEDEFtree type_Ref sym_Ref mods_Ref name_Ref tree_Ref {tree_Ref}
+ * | 49 TREE len_Nat 8 LABELtree type_Ref sym_Ref tree_Ref {tree_Ref}
+ * | 49 TREE len_Nat 9 IMPORTtree type_Ref sym_Ref tree_Ref {name_Ref name_Ref}
+ * | 49 TREE len_Nat 11 DOCDEFtree type_Ref sym_Ref string_Ref tree_Ref
+ * | 49 TREE len_Nat 12 TEMPLATEtree type_Ref sym_Ref numparents_Nat {tree_Ref} tree_Ref {tree_Ref}
+ * | 49 TREE len_Nat 13 BLOCKtree type_Ref tree_Ref {tree_Ref}
+ * | 49 TREE len_Nat 14 CASEtree type_Ref tree_Ref tree_Ref tree_Ref
+ * | 49 TREE len_Nat 15 SEQUENCEtree type_Ref {tree_Ref}
+ * | 49 TREE len_Nat 16 ALTERNATIVEtree type_Ref {tree_Ref}
+ * | 49 TREE len_Nat 17 STARtree type_Ref {tree_Ref}
+ * | 49 TREE len_Nat 18 BINDtree type_Ref sym_Ref name_Ref tree_Ref
+ * | 49 TREE len_Nat 19 UNAPPLYtree type_Ref tree_Ref {tree_Ref}
+ * | 49 TREE len_Nat 20 ARRAYVALUEtree type_Ref tree_Ref {tree_Ref}
+ * | 49 TREE len_Nat 21 FUNCTIONtree type_Ref sym_Ref tree_Ref {tree_Ref}
+ * | 49 TREE len_Nat 22 ASSIGNtree type_Ref tree_Ref tree_Ref
+ * | 49 TREE len_Nat 23 IFtree type_Ref tree_Ref tree_Ref tree_Ref
+ * | 49 TREE len_Nat 24 MATCHtree type_Ref tree_Ref {tree_Ref}
+ * | 49 TREE len_Nat 25 RETURNtree type_Ref sym_Ref tree_Ref
+ * | 49 TREE len_Nat 26 TREtree type_Ref tree_Ref tree_Ref {tree_Ref}
+ * | 49 TREE len_Nat 27 THROWtree type_Ref tree_Ref
+ * | 49 TREE len_Nat 28 NEWtree type_Ref tree_Ref
+ * | 49 TREE len_Nat 29 TYPEDtree type_Ref tree_Ref tree_Ref
+ * | 49 TREE len_Nat 30 TYPEAPPLYtree type_Ref tree_Ref {tree_Ref}
+ * | 49 TREE len_Nat 31 APPLYtree type_Ref tree_Ref {tree_Ref}
+ * | 49 TREE len_Nat 32 APPLYDYNAMICtree type_Ref sym_Ref tree_Ref {tree_Ref}
+ * | 49 TREE len_Nat 33 SUPERtree type_Ref sym_Ref tree_Ref name_Ref
+ * | 49 TREE len_Nat 34 THIStree type_Ref sym_Ref name_Ref
+ * | 49 TREE len_Nat 35 SELECTtree type_Ref sym_Ref tree_Ref name_Ref
+ * | 49 TREE len_Nat 36 IDENTtree type_Ref sym_Ref name_Ref
+ * | 49 TREE len_Nat 37 LITERALtree type_Ref constant_Ref
+ * | 49 TREE len_Nat 38 TYPEtree type_Ref
+ * | 49 TREE len_Nat 39 ANNOTATEDtree type_Ref tree_Ref tree_Ref
+ * | 49 TREE len_Nat 40 SINGLETONTYPEtree type_Ref tree_Ref
+ * | 49 TREE len_Nat 41 SELECTFROMTYPEtree type_Ref tree_Ref name_Ref
+ * | 49 TREE len_Nat 42 COMPOUNDTYPEtree type_Ref tree_Ref
+ * | 49 TREE len_Nat 43 APPLIEDTYPEtree type_Ref tree_Ref {tree_Ref}
+ * | 49 TREE len_Nat 44 TYPEBOUNDStree type_Ref tree_Ref tree_Ref
+ * | 49 TREE len_Nat 45 EXISTENTIALTYPEtree type_Ref tree_Ref {tree_Ref}
+ * | 50 MODIFIERS len_Nat flags_Long privateWithin_Ref
+ * SymbolInfo = name_Ref owner_Ref flags_LongNat [privateWithin_Ref] info_Ref
+ * NameInfo = <character sequence of length len_Nat in Utf8 format>
+ * NumInfo = <len_Nat-byte signed number in big endian format>
+ * Ref = Nat
+ * AnnotInfoBody = info_Ref {annotArg_Ref} {name_Ref constAnnotArg_Ref}
+ * AnnotArg = Tree | Constant
+ * ConstAnnotArg = Constant | AnnotInfo | AnnotArgArray
+ *
+ * len is remaining length after `len'.
+ */
+ val MajorVersion = 5
+ val MinorVersion = 0
+
+ final val TERMname = 1
+ final val TYPEname = 2
+ final val NONEsym = 3
+ final val TYPEsym = 4
+ final val ALIASsym = 5
+ final val CLASSsym = 6
+ final val MODULEsym = 7
+ final val VALsym = 8
+ final val EXTref = 9
+ final val EXTMODCLASSref = 10
+ final val NOtpe = 11
+ final val NOPREFIXtpe = 12
+ final val THIStpe = 13
+ final val SINGLEtpe = 14
+ final val CONSTANTtpe = 15
+ final val TYPEREFtpe = 16
+ final val TYPEBOUNDStpe = 17
+ final val REFINEDtpe = 18
+ final val CLASSINFOtpe = 19
+ final val METHODtpe = 20
+ final val POLYtpe = 21
+ final val IMPLICITMETHODtpe = 22
+
+ final val LITERAL = 23 // base line for literals
+ final val LITERALunit = 24
+ final val LITERALboolean = 25
+ final val LITERALbyte = 26
+ final val LITERALshort = 27
+ final val LITERALchar = 28
+ final val LITERALint = 29
+ final val LITERALlong = 30
+ final val LITERALfloat = 31
+ final val LITERALdouble = 32
+ final val LITERALstring = 33
+ final val LITERALnull = 34
+ final val LITERALclass = 35
+ final val LITERALenum = 36
+ final val SYMANNOT = 40
+ final val CHILDREN = 41
+ final val ANNOTATEDtpe = 42
+ final val ANNOTINFO = 43
+ final val ANNOTARGARRAY = 44
+
+ final val SUPERtpe = 46
+ final val DEBRUIJNINDEXtpe = 47
+ final val EXISTENTIALtpe = 48
+
+ final val TREE = 49 // prefix code that means a tree is coming
+ final val EMPTYtree = 1
+ final val PACKAGEtree = 2
+ final val CLASStree = 3
+ final val MODULEtree = 4
+ final val VALDEFtree = 5
+ final val DEFDEFtree = 6
+ final val TYPEDEFtree = 7
+ final val LABELtree = 8
+ final val IMPORTtree = 9
+ final val DOCDEFtree = 11
+ final val TEMPLATEtree = 12
+ final val BLOCKtree = 13
+ final val CASEtree = 14
+ // This node type has been removed.
+ // final val SEQUENCEtree = 15
+ final val ALTERNATIVEtree = 16
+ final val STARtree = 17
+ final val BINDtree = 18
+ final val UNAPPLYtree = 19
+ final val ARRAYVALUEtree = 20
+ final val FUNCTIONtree = 21
+ final val ASSIGNtree = 22
+ final val IFtree = 23
+ final val MATCHtree = 24
+ final val RETURNtree = 25
+ final val TREtree = 26
+ final val THROWtree = 27
+ final val NEWtree = 28
+ final val TYPEDtree = 29
+ final val TYPEAPPLYtree = 30
+ final val APPLYtree = 31
+ final val APPLYDYNAMICtree = 32
+ final val SUPERtree = 33
+ final val THIStree = 34
+ final val SELECTtree = 35
+ final val IDENTtree = 36
+ final val LITERALtree = 37
+ final val TYPEtree = 38
+ final val ANNOTATEDtree = 39
+ final val SINGLETONTYPEtree = 40
+ final val SELECTFROMTYPEtree = 41
+ final val COMPOUNDTYPEtree = 42
+ final val APPLIEDTYPEtree = 43
+ final val TYPEBOUNDStree = 44
+ final val EXISTENTIALTYPEtree = 45
+
+ final val MODIFIERS = 50
+
+ final val firstSymTag = NONEsym
+ final val lastSymTag = VALsym
+ final val lastExtSymTag = EXTMODCLASSref
+
+
+ // The following two are no longer accurate, because ANNOTATEDtpe,
+ // SUPERtpe, ... are not in the same range as the other types:
+ //final val firstTypeTag = NOtpe
+ //final val lastTypeTag = POLYtpe
+}
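The literal tags above occupy a contiguous range of offsets from the `LITERAL` base line, which is what the "base line for literals" comment alludes to. A small sketch of exploiting that layout; the constants are copied from the diff, while `isLiteralTag` and `literalKind` are hypothetical helpers, not part of the pickle format itself:

```scala
// Sketch: literal tags occupy the contiguous range (LITERAL, LITERALenum],
// so a tag can be classified by its offset from the base line.
// `literalKind` is a hypothetical helper, not part of the format.
object PickleTagDemo {
  final val LITERAL     = 23 // base line for literals
  final val LITERALint  = 29
  final val LITERALenum = 36 // last literal tag

  def isLiteralTag(tag: Int): Boolean = tag > LITERAL && tag <= LITERALenum
  def literalKind(tag: Int): Int = {
    require(isLiteralTag(tag), "not a literal tag: " + tag)
    tag - LITERAL
  }
}
```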
diff --git a/src/library/scala/reflect/generic/Scopes.scala b/src/library/scala/reflect/generic/Scopes.scala
new file mode 100755
index 0000000000..9f8a8ecd19
--- /dev/null
+++ b/src/library/scala/reflect/generic/Scopes.scala
@@ -0,0 +1,15 @@
+package scala.reflect
+package generic
+
+trait Scopes { self: Universe =>
+
+ abstract class AbsScope extends Iterable[Symbol] {
+ def enter(sym: Symbol): Symbol
+ }
+
+ type Scope <: AbsScope
+
+ def newScope(): Scope
+}
+
+
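The `AbsScope` contract above asks only that a scope be an `Iterable[Symbol]` with an `enter` method. A minimal concrete sketch of that contract, using a plain `String` as an illustrative stand-in for the abstract `Symbol` type (this is not the compiler's real `Scope` implementation):

```scala
// Illustrative stand-in for a concrete Scope: `enter` records the symbol
// and returns it, and iteration yields the entered symbols in entry order.
// A plain String stands in for the abstract Symbol type.
class DemoScope extends Iterable[String] {
  private var syms: List[String] = Nil
  def enter(sym: String): String = { syms ::= sym; sym }
  def iterator: Iterator[String] = syms.reverseIterator
}
```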
diff --git a/src/library/scala/reflect/generic/StandardDefinitions.scala b/src/library/scala/reflect/generic/StandardDefinitions.scala
new file mode 100755
index 0000000000..24dce7173a
--- /dev/null
+++ b/src/library/scala/reflect/generic/StandardDefinitions.scala
@@ -0,0 +1,67 @@
+/* NSC -- new Scala compiler
+ * Copyright 2005-2010 LAMP/EPFL
+ * @author Martin Odersky
+ */
+// $Id: Definitions.scala 20619 2010-01-20 10:55:56Z rytz $
+
+package scala.reflect
+package generic
+
+trait StandardDefinitions { self: Universe =>
+
+ val definitions: AbsDefinitions
+
+ abstract class AbsDefinitions {
+
+ // outer packages and their classes
+ def RootPackage: Symbol
+ def RootClass: Symbol
+ def EmptyPackage: Symbol
+ def EmptyPackageClass: Symbol
+
+ def ScalaPackage: Symbol
+ def ScalaPackageClass: Symbol
+
+ // top types
+ def AnyClass : Symbol
+ def AnyValClass: Symbol
+ def AnyRefClass: Symbol
+ def ObjectClass: Symbol
+
+ // bottom types
+ def NullClass : Symbol
+ def NothingClass: Symbol
+
+ // the scala value classes
+ def UnitClass : Symbol
+ def ByteClass : Symbol
+ def ShortClass : Symbol
+ def CharClass : Symbol
+ def IntClass : Symbol
+ def LongClass : Symbol
+ def FloatClass : Symbol
+ def DoubleClass : Symbol
+ def BooleanClass: Symbol
+
+ // fundamental reference classes
+ def SymbolClass : Symbol
+ def StringClass : Symbol
+ def ClassClass : Symbol
+
+ // fundamental modules
+ def PredefModule: Symbol
+
+ // fundamental type constructions
+ def ClassType(arg: Type): Type
+
+ /** The string representation used by the given type in the VM.
+ */
+ def signature(tp: Type): String
+
+ /** Is symbol one of the value classes? */
+ def isValueClass(sym: Symbol): Boolean
+
+ /** Is symbol one of the numeric value classes? */
+ def isNumericValueClass(sym: Symbol): Boolean
+ }
+}
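The two predicates at the end of `AbsDefinitions` are related: every numeric value class is a value class, but `Unit` and `Boolean` are value classes that are not numeric. A sketch of that relationship over plain class names, which stand in for the `Symbol`s declared above:

```scala
// Sketch of the expected relationship between the two predicates:
// numeric value classes = value classes minus Unit and Boolean.
// Plain strings stand in for the abstract Symbol type.
object ValueClassDemo {
  val valueClasses = Set("Unit", "Boolean", "Byte", "Short", "Char",
                         "Int", "Long", "Float", "Double")
  val numericValueClasses = valueClasses -- Set("Unit", "Boolean")

  def isValueClass(name: String) = valueClasses(name)
  def isNumericValueClass(name: String) = numericValueClasses(name)
}
```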
diff --git a/src/library/scala/reflect/generic/StdNames.scala b/src/library/scala/reflect/generic/StdNames.scala
new file mode 100755
index 0000000000..7a3b9169d8
--- /dev/null
+++ b/src/library/scala/reflect/generic/StdNames.scala
@@ -0,0 +1,26 @@
+package scala.reflect
+package generic
+
+trait StdNames { self: Universe =>
+
+ val nme: StandardNames
+
+ class StandardNames {
+ val EXPAND_SEPARATOR_STRING = "$$"
+ val LOCAL_SUFFIX_STRING = " "
+
+ val ANON_CLASS_NAME = newTermName("$anon")
+ val ANON_FUN_NAME = newTermName("$anonfun")
+ val EMPTY_PACKAGE_NAME = newTermName("<empty>")
+ val IMPORT = newTermName("<import>")
+ val REFINE_CLASS_NAME = newTermName("<refinement>")
+ val ROOT = newTermName("<root>")
+ val ROOTPKG = newTermName("_root_")
+ val EMPTY = newTermName("")
+
+ /** The expanded name of `name` relative to the base symbol `base`, with the given `separator`.
+ */
+ def expandedName(name: Name, base: Symbol, separator: String = EXPAND_SEPARATOR_STRING): Name =
+ newTermName(base.fullName('$') + separator + name)
+ }
+}
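The `expandedName` rule above concatenates the base symbol's `'$'`-separated full name, the `"$$"` separator, and the member name. A self-contained sketch, where `baseFullName` is a plain `String` standing in for the `base.fullName('$')` call of the real method:

```scala
// Sketch of the name-expansion rule: base.fullName('$') + "$$" + name.
// `baseFullName` stands in for the Symbol-based base.fullName('$').
object ExpandedNameDemo {
  val EXPAND_SEPARATOR_STRING = "$$"
  def expandedName(name: String, baseFullName: String,
                   separator: String = EXPAND_SEPARATOR_STRING): String =
    baseFullName + separator + name
}
```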
diff --git a/src/library/scala/reflect/generic/Symbols.scala b/src/library/scala/reflect/generic/Symbols.scala
new file mode 100755
index 0000000000..2f5e0624ab
--- /dev/null
+++ b/src/library/scala/reflect/generic/Symbols.scala
@@ -0,0 +1,194 @@
+package scala.reflect
+package generic
+
+import Flags._
+
+trait Symbols { self: Universe =>
+
+ type Symbol >: Null <: AbsSymbol
+
+ abstract class AbsSymbol { this: Symbol =>
+
+ /** The owner of this symbol.
+ */
+ def owner: Symbol
+
+ /** The flags of this symbol */
+ def flags: Long
+
+ /** The name of the symbol as a member of the `Name` type.
+ */
+ def name: Name
+
+ /** The name of the symbol before decoding, e.g. `$eq$eq` instead of `==`.
+ */
+ def encodedName: String
+
+ /** The decoded name of the symbol, e.g. `==` instead of `$eq$eq`.
+ */
+ def decodedName: String = stripLocalSuffix(NameTransformer.decode(encodedName))
+
+ /** The encoded full path name of this symbol, where outer names and inner names
+ * are separated by `separator` characters.
+ * Never translates expansions of operators back to operator symbol.
+ * Never adds id.
+ */
+ final def fullName(separator: Char): String = stripLocalSuffix {
+ if (isRoot || isRootPackage || this == NoSymbol) this.toString
+ else if (owner.isEffectiveRoot) encodedName
+ else owner.enclClass.fullName(separator) + separator + encodedName
+ }
+
+ private def stripLocalSuffix(s: String) = s stripSuffix nme.LOCAL_SUFFIX_STRING
+
+ /** The encoded full path name of this symbol, where outer names and inner names
+ * are separated by periods.
+ */
+ final def fullName: String = fullName('.')
+
+ /** Does symbol have ANY flag in `mask` set? */
+ final def hasFlag(mask: Long): Boolean = (flags & mask) != 0L
+
+ /** Does symbol have ALL the flags in `mask` set? */
+ final def hasAllFlags(mask: Long): Boolean = (flags & mask) == mask
+
+ /** Set when symbol has a modifier of the form private[X], NoSymbol otherwise.
+ */
+ def privateWithin: Symbol
+
+ /** The raw info of the type
+ */
+ def rawInfo: Type
+
+ /** The type of the symbol
+ */
+ def tpe: Type = info
+
+ /** The info of the symbol. This is like tpe, except for class symbols where the `info`
+ * describes the contents of the class whereas the `tpe` is a reference to the class.
+ */
+ def info: Type = {
+ val tp = rawInfo
+ tp.complete(this)
+ tp
+ }
+
+ /** If this symbol is a class or trait, its self type; otherwise the type of the symbol itself.
+ */
+ def typeOfThis: Type
+
+ def owner_=(sym: Symbol) { throw new UnsupportedOperationException("owner_= inapplicable for " + this) }
+ def flags_=(flags: Long) { throw new UnsupportedOperationException("flags_= inapplicable for " + this) }
+ def info_=(tp: Type) { throw new UnsupportedOperationException("info_= inapplicable for " + this) }
+ def typeOfThis_=(tp: Type) { throw new UnsupportedOperationException("typeOfThis_= inapplicable for " + this) }
+ def privateWithin_=(sym: Symbol) { throw new UnsupportedOperationException("privateWithin_= inapplicable for " + this) }
+ def sourceModule_=(sym: Symbol) { throw new UnsupportedOperationException("sourceModule_= inapplicable for " + this) }
+ def addChild(sym: Symbol) { throw new UnsupportedOperationException("addChild inapplicable for " + this) }
+ def addAnnotation(annot: AnnotationInfo) { throw new UnsupportedOperationException("addAnnotation inapplicable for " + this) }
+
+ /** For a module class its linked class, for a plain class
+ * the module class of its linked module.
+ * For instance
+ * object Foo
+ * class Foo
+ *
+ * Then object Foo has a `moduleClass` (invisible to the user; the backend calls it Foo$).
+ * linkedClassOfClass goes from class Foo$ to class Foo, and back.
+ */
+ def linkedClassOfClass: Symbol
+
+ /** The module corresponding to this module class (note that this
+ * is not updated when a module is cloned), or NoSymbol if this is not a ModuleClass
+ */
+ def sourceModule: Symbol = NoSymbol
+
+ /** If symbol is an object definition, its implied associated class,
+ * otherwise NoSymbol
+ */
+ def moduleClass: Symbol
+
+// flags and kind tests
+
+ def isTerm = false // to be overridden
+ def isType = false // to be overridden
+ def isClass = false // to be overridden
+ def isAliasType = false // to be overridden
+ def isAbstractType = false // to be overridden
+ private[scala] def isSkolem = false // to be overridden
+
+ def isTrait: Boolean = isClass && hasFlag(TRAIT) // refined later for virtual classes.
+ final def hasDefault = isParameter && hasFlag(DEFAULTPARAM)
+ final def isAbstractClass = isClass && hasFlag(ABSTRACT)
+ final def isAbstractOverride = isTerm && hasFlag(ABSTRACT) && hasFlag(OVERRIDE)
+ final def isBridge = hasFlag(BRIDGE)
+ final def isCase = hasFlag(CASE)
+ final def isCaseAccessor = hasFlag(CASEACCESSOR)
+ final def isContravariant = isType && hasFlag(CONTRAVARIANT)
+ final def isCovariant = isType && hasFlag(COVARIANT)
+ final def isDeferred = hasFlag(DEFERRED) && !isClass
+ final def isEarlyInitialized: Boolean = isTerm && hasFlag(PRESUPER)
+ final def isExistentiallyBound = isType && hasFlag(EXISTENTIAL)
+ final def isFinal = hasFlag(FINAL)
+ final def isGetterOrSetter = hasFlag(ACCESSOR)
+ final def isImplClass = isClass && hasFlag(IMPLCLASS) // Is this symbol an implementation class for a mixin?
+ final def isImplicit = hasFlag(IMPLICIT)
+ final def isInterface = hasFlag(INTERFACE)
+ final def isJavaDefined = hasFlag(JAVA)
+ final def isLazy = hasFlag(LAZY)
+ final def isMethod = isTerm && hasFlag(METHOD)
+ final def isModule = isTerm && hasFlag(MODULE)
+ final def isModuleClass = isClass && hasFlag(MODULE)
+ final def isMutable = hasFlag(MUTABLE)
+ final def isOverloaded = hasFlag(OVERLOADED)
+ final def isOverride = hasFlag(OVERRIDE)
+ final def isParamAccessor = hasFlag(PARAMACCESSOR)
+ final def isParameter = hasFlag(PARAM)
+ final def isRefinementClass = isClass && name == mkTypeName(nme.REFINE_CLASS_NAME)
+ final def isSealed = isClass && (hasFlag(SEALED) || definitions.isValueClass(this))
+ final def isSourceMethod = isTerm && (flags & (METHOD | STABLE)) == METHOD // exclude all accessors!!!
+ final def isSuperAccessor = hasFlag(SUPERACCESSOR)
+ final def isSynthetic = hasFlag(SYNTHETIC)
+ final def isTypeParameter = isType && isParameter && !isSkolem
+
+ /** Access tests */
+ final def isPrivate = hasFlag(PRIVATE)
+ final def isPrivateLocal = hasFlag(PRIVATE) && hasFlag(LOCAL)
+ final def isProtected = hasFlag(PROTECTED)
+ final def isProtectedLocal = hasFlag(PROTECTED) && hasFlag(LOCAL)
+ final def isPublic = !hasFlag(PRIVATE | PROTECTED) && privateWithin == NoSymbol
+
+ /** Package tests */
+ final def isEmptyPackage = isPackage && name == nme.EMPTY_PACKAGE_NAME
+ final def isEmptyPackageClass = isPackageClass && name == mkTypeName(nme.EMPTY_PACKAGE_NAME)
+ final def isPackage = isModule && hasFlag(PACKAGE)
+ final def isPackageClass = isClass && hasFlag(PACKAGE)
+ final def isRoot = isPackageClass && owner == NoSymbol
+ final def isRootPackage = isPackage && owner == NoSymbol
+
+ /** Is this symbol an effective root for the fullname string?
+ */
+ def isEffectiveRoot = isRoot || isEmptyPackageClass
+
+ // creators
+
+ def newAbstractType(name: Name, pos: Position = NoPosition): Symbol
+ def newAliasType(name: Name, pos: Position = NoPosition): Symbol
+ def newClass(name: Name, pos: Position = NoPosition): Symbol
+ def newMethod(name: Name, pos: Position = NoPosition): Symbol
+ def newModule(name: Name, clazz: Symbol, pos: Position = NoPosition): Symbol
+ def newModuleClass(name: Name, pos: Position = NoPosition): Symbol
+ def newValue(name: Name, pos: Position = NoPosition): Symbol
+
+ // access to related symbols
+
+ /** The next enclosing class */
+ def enclClass: Symbol = if (isClass) this else owner.enclClass
+
+ /** The next enclosing method */
+ def enclMethod: Symbol = if (isSourceMethod) this else owner.enclMethod
+ }
+
+ val NoSymbol: Symbol
+}
+
+
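`hasFlag` and `hasAllFlags` in the `AbsSymbol` class above differ only in the comparison: ANY bit of the mask set versus ALL bits of the mask set. A standalone sketch of that distinction; the flag values are illustrative, not the real `Flags` constants:

```scala
// ANY-vs-ALL flag tests, mirroring hasFlag / hasAllFlags above.
// The flag bit values are illustrative, not the real Flags constants.
object FlagDemo {
  final val PRIVATE = 0x4L
  final val LOCAL   = 0x8L

  def hasFlag(flags: Long, mask: Long)     = (flags & mask) != 0L  // ANY bit
  def hasAllFlags(flags: Long, mask: Long) = (flags & mask) == mask // ALL bits
}
```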
diff --git a/src/library/scala/reflect/generic/Trees.scala b/src/library/scala/reflect/generic/Trees.scala
new file mode 100755
index 0000000000..df93d157e3
--- /dev/null
+++ b/src/library/scala/reflect/generic/Trees.scala
@@ -0,0 +1,738 @@
+package scala.reflect
+package generic
+
+import java.io.{PrintWriter, StringWriter}
+import Flags._
+
+trait Trees { self: Universe =>
+
+ abstract class AbsTreePrinter(out: PrintWriter) {
+ def print(tree: Tree)
+ def flush()
+ }
+
+ def newTreePrinter(out: PrintWriter): AbsTreePrinter
+
+ private[scala] var nodeCount = 0
+
+ /** @param privateWithin the qualifier for a private (a type name)
+ * or nme.EMPTY.toTypeName, if none is given.
+ * @param annotations the annotations for the definition.
+ * <strong>Note:</strong> the typechecker drops these annotations,
+ * use the AnnotationInfo's (Symbol.annotations) in later phases.
+ */
+ case class Modifiers(flags: Long, privateWithin: Name, annotations: List[Tree], positions: Map[Long, Position]) {
+ def isAbstract = hasFlag(ABSTRACT )
+ def isAccessor = hasFlag(ACCESSOR )
+ def isArgument = hasFlag(PARAM )
+ def isCase = hasFlag(CASE )
+ def isContravariant = hasFlag(CONTRAVARIANT) // marked with `-'
+ def isCovariant = hasFlag(COVARIANT ) // marked with `+'
+ def isDeferred = hasFlag(DEFERRED )
+ def isFinal = hasFlag(FINAL )
+ def isImplicit = hasFlag(IMPLICIT )
+ def isLazy = hasFlag(LAZY )
+ def isOverride = hasFlag(OVERRIDE )
+ def isPrivate = hasFlag(PRIVATE )
+ def isProtected = hasFlag(PROTECTED)
+ def isPublic = !isPrivate && !isProtected
+ def isSealed = hasFlag(SEALED )
+ def isTrait = hasFlag(TRAIT )
+ def isVariable = hasFlag(MUTABLE )
+
+ def hasFlag(flag: Long) = (flag & flags) != 0L
+ def & (flag: Long): Modifiers = {
+ val flags1 = flags & flag
+ if (flags1 == flags) this
+ else Modifiers(flags1, privateWithin, annotations, positions)
+ }
+ def &~ (flag: Long): Modifiers = {
+ val flags1 = flags & (~flag)
+ if (flags1 == flags) this
+ else Modifiers(flags1, privateWithin, annotations, positions)
+ }
+ def | (flag: Long): Modifiers = {
+ val flags1 = flags | flag
+ if (flags1 == flags) this
+ else Modifiers(flags1, privateWithin, annotations, positions)
+ }
+ def withAnnotations(annots: List[Tree]) =
+ if (annots.isEmpty) this
+ else copy(annotations = annotations ::: annots)
+ def withPosition(flag: Long, position: Position) =
+ copy(positions = positions + (flag -> position))
+ }
+
+ def Modifiers(flags: Long, privateWithin: Name): Modifiers = Modifiers(flags, privateWithin, List(), Map.empty)
+ def Modifiers(flags: Long): Modifiers = Modifiers(flags, mkTypeName(nme.EMPTY))
+
+ lazy val NoMods = Modifiers(0)
+
+ abstract class Tree extends Product {
+ val id = nodeCount
+ nodeCount += 1
+
+ private[this] var rawpos: Position = NoPosition
+
+ def pos = rawpos
+ def pos_=(pos: Position) = rawpos = pos
+ def setPos(pos: Position): this.type = { rawpos = pos; this }
+
+ private[this] var rawtpe: Type = _
+
+ def tpe = rawtpe
+ def tpe_=(t: Type) = rawtpe = t
+
+ /** Set tpe to the given `tp` and return this.
+ */
+ def setType(tp: Type): this.type = { rawtpe = tp; this }
+
+ /** Like `setType`, but if this is a previously empty TypeTree
+ * that fact is remembered so that resetType will snap back.
+ */
+ def defineType(tp: Type): this.type = setType(tp)
+
+ def symbol: Symbol = null
+ def symbol_=(sym: Symbol) { throw new UnsupportedOperationException("symbol_= inapplicable for " + this) }
+ def setSymbol(sym: Symbol): this.type = { symbol = sym; this }
+
+ def hasSymbol = false
+ def isDef = false
+ def isEmpty = false
+
+ /** The direct child trees of this tree
+ * EmptyTrees are always omitted. Lists are collapsed.
+ */
+ def children: List[Tree] = {
+ def subtrees(x: Any): List[Tree] = x match {
+ case EmptyTree => List()
+ case t: Tree => List(t)
+ case xs: List[_] => xs flatMap subtrees
+ case _ => List()
+ }
+ productIterator.toList flatMap subtrees
+ }
+
+ /** In compiler: Make a copy of this tree, keeping all attributes,
+ * except that all positions are focussed (so nothing
+ * in this tree will be found when searching by position).
+ * If not in compiler may also return tree unchanged.
+ */
+ private[scala] def duplicate: this.type =
+ duplicateTree(this).asInstanceOf[this.type]
+
+ private[scala] def copyAttrs(tree: Tree): this.type = {
+ pos = tree.pos
+ tpe = tree.tpe
+ if (hasSymbol) symbol = tree.symbol
+ this
+ }
+
+ override def toString(): String = {
+ val buffer = new StringWriter()
+ val printer = newTreePrinter(new PrintWriter(buffer))
+ printer.print(this)
+ printer.flush()
+ buffer.toString
+ }
+
+ override def hashCode(): Int = super.hashCode()
+
+ override def equals(that: Any): Boolean = that match {
+ case t: Tree => this eq t
+ case _ => false
+ }
+ }
+
+ private[scala] def duplicateTree(tree: Tree): Tree = tree
+
+ trait SymTree extends Tree {
+ override def hasSymbol = true
+ override var symbol: Symbol = NoSymbol
+ }
+
+ trait RefTree extends SymTree {
+ def name: Name
+ }
+
+ abstract class DefTree extends SymTree {
+ def name: Name
+ override def isDef = true
+ }
+
+ trait TermTree extends Tree
+
+ /** A tree for a type. Note that not all type trees implement
+ * this trait; in particular, Idents are an exception. */
+ trait TypTree extends Tree
+
+// ----- tree node alternatives --------------------------------------
+
+ /** The empty tree */
+ case object EmptyTree extends TermTree {
+ super.tpe_=(NoType)
+ override def tpe_=(t: Type) =
+ if (t != NoType) throw new UnsupportedOperationException("tpe_=("+t+") inapplicable for <empty>")
+ override def isEmpty = true
+ }
+
+ abstract class MemberDef extends DefTree {
+ def mods: Modifiers
+ def keyword: String = this match {
+ case TypeDef(_, _, _, _) => "type"
+ case ClassDef(mods, _, _, _) => if (mods.isTrait) "trait" else "class"
+ case DefDef(_, _, _, _, _, _) => "def"
+ case ModuleDef(_, _, _) => "object"
+ case PackageDef(_, _) => "package"
+ case ValDef(mods, _, _, _) => if (mods.isVariable) "var" else "val"
+ case _ => ""
+ }
+ final def hasFlag(mask: Long): Boolean = (mods.flags & mask) != 0L
+ }
+
+ /** Package clause
+ */
+ case class PackageDef(pid: RefTree, stats: List[Tree])
+ extends MemberDef {
+ def name = pid.name
+ def mods = NoMods
+ }
+
+ abstract class ImplDef extends MemberDef {
+ def impl: Template
+ }
+
+ /** Class definition */
+ case class ClassDef(mods: Modifiers, name: Name, tparams: List[TypeDef], impl: Template)
+ extends ImplDef
+
+ /** Singleton object definition
+ */
+ case class ModuleDef(mods: Modifiers, name: Name, impl: Template)
+ extends ImplDef
+
+ abstract class ValOrDefDef extends MemberDef {
+ def tpt: Tree
+ def rhs: Tree
+ }
+
+ /** Value definition
+ */
+ case class ValDef(mods: Modifiers, name: Name, tpt: Tree, rhs: Tree) extends ValOrDefDef
+
+ /** Method definition
+ */
+ case class DefDef(mods: Modifiers, name: Name, tparams: List[TypeDef],
+ vparamss: List[List[ValDef]], tpt: Tree, rhs: Tree) extends ValOrDefDef
+
+ /** Abstract type, type parameter, or type alias */
+ case class TypeDef(mods: Modifiers, name: Name, tparams: List[TypeDef], rhs: Tree)
+ extends MemberDef
+
+ /** <p>
+ * Labelled expression - the symbols in the array (must be Idents!)
+ * are those the label takes as arguments.
+ * </p>
+ * <p>
+ * The symbol that is given to the labeldef should have a MethodType
+ * (as if it were a nested function)
+ * </p>
+ * <p>
+ * Jumps are apply nodes attributed with label symbol, the arguments
+ * will get assigned to the idents.
+ * </p>
+ * <p>
+ * Note: on 2005-06-09 Martin, Iuli, Burak agreed to have forward
+ * jumps within a Block.
+ * </p>
+ */
+ case class LabelDef(name: Name, params: List[Ident], rhs: Tree)
+ extends DefTree with TermTree
+
+
+ /** Import selector
+ *
+ * Representation of an imported name, its optional rename, and their optional positions.
+ *
+ * @param name the imported name
+ * @param namePos its position or -1 if undefined
+ * @param rename the name the import is renamed to (== name if no renaming)
+ * @param renamePos the position of the rename or -1 if undefined
+ */
+ case class ImportSelector(name: Name, namePos: Int, rename: Name, renamePos: Int)
+
+ /** Import clause
+ *
+ * @param expr
+ * @param selectors
+ */
+ case class Import(expr: Tree, selectors: List[ImportSelector])
+ extends SymTree
+ // The symbol of an Import is an import symbol @see Symbol.newImport
+ // It's used primarily as a marker to check that the import has been typechecked.
+
+ /** Instantiation template of a class or trait
+ *
+ * @param parents
+ * @param body
+ */
+ case class Template(parents: List[Tree], self: ValDef, body: List[Tree])
+ extends SymTree {
+ // the symbol of a template is a local dummy. @see Symbol.newLocalDummy
+ // the owner of the local dummy is the enclosing trait or class.
+ // the local dummy is itself the owner of any local blocks
+ // For example:
+ //
+ // class C {
+ //   def foo   // owner is C
+ //   {
+ //     def bar // owner is local dummy
+ //   }
+ // }
+ }
+
+ /** Block of expressions (semicolon separated expressions) */
+ case class Block(stats: List[Tree], expr: Tree)
+ extends TermTree
+
+ /** Case clause in a pattern match, eliminated by TransMatch
+ * (except for occurrences in switch statements)
+ */
+ case class CaseDef(pat: Tree, guard: Tree, body: Tree)
+ extends Tree
+
+ /** Alternatives of patterns, eliminated by TransMatch, except for
+ * occurrences in an encoded switch statement (i.e. the remaining Match(CaseDef(...)) nodes)
+ */
+ case class Alternative(trees: List[Tree])
+ extends TermTree
+
+ /** Repetition of pattern, eliminated by TransMatch */
+ case class Star(elem: Tree)
+ extends TermTree
+
+ /** Bind of a variable to a rhs pattern, eliminated by TransMatch
+ *
+ * @param name
+ * @param body
+ */
+ case class Bind(name: Name, body: Tree)
+ extends DefTree
+
+ case class UnApply(fun: Tree, args: List[Tree])
+ extends TermTree
+
+ /** Array of expressions; needs to be translated in the backend.
+ */
+ case class ArrayValue(elemtpt: Tree, elems: List[Tree])
+ extends TermTree
+
+ /** Anonymous function, eliminated by analyzer */
+ case class Function(vparams: List[ValDef], body: Tree)
+ extends TermTree with SymTree
+ // The symbol of a Function is a synthetic value of name nme.ANON_FUN_NAME
+ // It is the owner of the function's parameters.
+
+ /** Assignment */
+ case class Assign(lhs: Tree, rhs: Tree)
+ extends TermTree
+
+ /** Conditional expression */
+ case class If(cond: Tree, thenp: Tree, elsep: Tree)
+ extends TermTree
+
+ /** <p>
+ * Pattern matching expression (before <code>TransMatch</code>)
+ * Switch statements (after TransMatch)
+ * </p>
+ * <p>
+ * After <code>TransMatch</code>, cases will satisfy the following
+ * constraints:
+ * </p>
+ * <ul>
+ * <li>all guards are EmptyTree,</li>
+ * <li>all patterns will be either <code>Literal(Constant(x:Int))</code>
+ * or <code>Alternative(lit|...|lit)</code></li>
+ * <li>except for an "otherwise" branch, which has pattern
+ * <code>Ident(nme.WILDCARD)</code></li>
+ * </ul>
+ */
+ case class Match(selector: Tree, cases: List[CaseDef])
+ extends TermTree
+
+ /** Return expression */
+ case class Return(expr: Tree)
+ extends TermTree with SymTree
+ // The symbol of a Return node is the enclosing method.
+
+ case class Try(block: Tree, catches: List[CaseDef], finalizer: Tree)
+ extends TermTree
+
+ /** Throw expression */
+ case class Throw(expr: Tree)
+ extends TermTree
+
+ /** Object instantiation
+ * One should always use the factory method below to build a user-level new.
+ *
+ * @param tpt a class type
+ */
+ case class New(tpt: Tree) extends TermTree
+
+ /** Type annotation, eliminated by explicit outer */
+ case class Typed(expr: Tree, tpt: Tree)
+ extends TermTree
+
+ // Martin to Sean: Should GenericApply/TypeApply/Apply not be SymTree's? After all,
+ // ApplyDynamic is a SymTree.
+ abstract class GenericApply extends TermTree {
+ val fun: Tree
+ val args: List[Tree]
+ }
+
+ /** Type application */
+ case class TypeApply(fun: Tree, args: List[Tree])
+ extends GenericApply {
+ override def symbol: Symbol = fun.symbol
+ override def symbol_=(sym: Symbol) { fun.symbol = sym }
+ }
+
+ /** Value application */
+ case class Apply(fun: Tree, args: List[Tree])
+ extends GenericApply {
+ override def symbol: Symbol = fun.symbol
+ override def symbol_=(sym: Symbol) { fun.symbol = sym }
+ }
+
+ /** Dynamic value application.
+ * In a dynamic application q.f(as)
+ * - q is stored in qual
+ * - as is stored in args
+ * - f is stored as the node's symbol field.
+ */
+ case class ApplyDynamic(qual: Tree, args: List[Tree])
+ extends TermTree with SymTree
+ // The symbol of an ApplyDynamic is the function symbol of `qual', or NoSymbol, if there is none.
+
+ /** Super reference */
+ case class Super(qual: Name, mix: Name)
+ extends TermTree with SymTree
+ // The symbol of a Super is the class _from_ which the super reference is made.
+ // For instance in C.super(...), it would be C.
+
+ /** Self reference */
+ case class This(qual: Name)
+ extends TermTree with SymTree
+ // The symbol of a This is the class to which the this refers.
+ // For instance in C.this, it would be C.
+
+ /** Designator <qualifier> . <name> */
+ case class Select(qualifier: Tree, name: Name)
+ extends RefTree
+
+ /** Identifier <name> */
+ case class Ident(name: Name)
+ extends RefTree
+
+ class BackQuotedIdent(name: Name) extends Ident(name)
+
+ /** Literal */
+ case class Literal(value: Constant)
+ extends TermTree {
+ assert(value ne null)
+ }
+
+ def Literal(value: Any): Literal =
+ Literal(Constant(value))
+
+ type TypeTree <: AbsTypeTree
+ val TypeTree: TypeTreeExtractor
+
+ abstract class TypeTreeExtractor {
+ def apply(): TypeTree
+ def unapply(tree: TypeTree): Boolean
+ }
+
+ class Traverser {
+ protected var currentOwner: Symbol = definitions.RootClass
+ def traverse(tree: Tree): Unit = tree match {
+ case EmptyTree =>
+ ;
+ case PackageDef(pid, stats) =>
+ traverse(pid)
+ atOwner(tree.symbol.moduleClass) {
+ traverseTrees(stats)
+ }
+ case ClassDef(mods, name, tparams, impl) =>
+ atOwner(tree.symbol) {
+ traverseTrees(mods.annotations); traverseTrees(tparams); traverse(impl)
+ }
+ case ModuleDef(mods, name, impl) =>
+ atOwner(tree.symbol.moduleClass) {
+ traverseTrees(mods.annotations); traverse(impl)
+ }
+ case ValDef(mods, name, tpt, rhs) =>
+ atOwner(tree.symbol) {
+ traverseTrees(mods.annotations); traverse(tpt); traverse(rhs)
+ }
+ case DefDef(mods, name, tparams, vparamss, tpt, rhs) =>
+ atOwner(tree.symbol) {
+ traverseTrees(mods.annotations); traverseTrees(tparams); traverseTreess(vparamss); traverse(tpt); traverse(rhs)
+ }
+ case TypeDef(mods, name, tparams, rhs) =>
+ atOwner(tree.symbol) {
+ traverseTrees(mods.annotations); traverseTrees(tparams); traverse(rhs)
+ }
+ case LabelDef(name, params, rhs) =>
+ traverseTrees(params); traverse(rhs)
+ case Import(expr, selectors) =>
+ traverse(expr)
+ case Annotated(annot, arg) =>
+ traverse(annot); traverse(arg)
+ case Template(parents, self, body) =>
+ traverseTrees(parents)
+ if (!self.isEmpty) traverse(self)
+ traverseStats(body, tree.symbol)
+ case Block(stats, expr) =>
+ traverseTrees(stats); traverse(expr)
+ case CaseDef(pat, guard, body) =>
+ traverse(pat); traverse(guard); traverse(body)
+ case Alternative(trees) =>
+ traverseTrees(trees)
+ case Star(elem) =>
+ traverse(elem)
+ case Bind(name, body) =>
+ traverse(body)
+ case UnApply(fun, args) =>
+ traverse(fun); traverseTrees(args)
+ case ArrayValue(elemtpt, trees) =>
+ traverse(elemtpt); traverseTrees(trees)
+ case Function(vparams, body) =>
+ atOwner(tree.symbol) {
+ traverseTrees(vparams); traverse(body)
+ }
+ case Assign(lhs, rhs) =>
+ traverse(lhs); traverse(rhs)
+ case If(cond, thenp, elsep) =>
+ traverse(cond); traverse(thenp); traverse(elsep)
+ case Match(selector, cases) =>
+ traverse(selector); traverseTrees(cases)
+ case Return(expr) =>
+ traverse(expr)
+ case Try(block, catches, finalizer) =>
+ traverse(block); traverseTrees(catches); traverse(finalizer)
+ case Throw(expr) =>
+ traverse(expr)
+ case New(tpt) =>
+ traverse(tpt)
+ case Typed(expr, tpt) =>
+ traverse(expr); traverse(tpt)
+ case TypeApply(fun, args) =>
+ traverse(fun); traverseTrees(args)
+ case Apply(fun, args) =>
+ traverse(fun); traverseTrees(args)
+ case ApplyDynamic(qual, args) =>
+ traverse(qual); traverseTrees(args)
+ case Super(_, _) =>
+ ;
+ case This(_) =>
+ ;
+ case Select(qualifier, selector) =>
+ traverse(qualifier)
+ case Ident(_) =>
+ ;
+ case Literal(_) =>
+ ;
+ case TypeTree() =>
+ ;
+ case SingletonTypeTree(ref) =>
+ traverse(ref)
+ case SelectFromTypeTree(qualifier, selector) =>
+ traverse(qualifier)
+ case CompoundTypeTree(templ) =>
+ traverse(templ)
+ case AppliedTypeTree(tpt, args) =>
+ traverse(tpt); traverseTrees(args)
+ case TypeBoundsTree(lo, hi) =>
+ traverse(lo); traverse(hi)
+ case ExistentialTypeTree(tpt, whereClauses) =>
+ traverse(tpt); traverseTrees(whereClauses)
+ case SelectFromArray(qualifier, selector, erasure) =>
+ traverse(qualifier)
+ }
+
+ def traverseTrees(trees: List[Tree]) {
+ trees foreach traverse
+ }
+ def traverseTreess(treess: List[List[Tree]]) {
+ treess foreach traverseTrees
+ }
+ def traverseStats(stats: List[Tree], exprOwner: Symbol) {
+ stats foreach (stat =>
+ if (exprOwner != currentOwner) atOwner(exprOwner)(traverse(stat))
+ else traverse(stat)
+ )
+ }
+
+ def atOwner(owner: Symbol)(traverse: => Unit) {
+ val prevOwner = currentOwner
+ currentOwner = owner
+ traverse
+ currentOwner = prevOwner
+ }
+ }
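The `atOwner` method that closes the `Traverser` above is a save/restore idiom: the current owner is swapped in for the duration of one traversal step and restored afterwards, so nested `atOwner` calls track the lexical owner chain. A standalone sketch of the same idiom, with a `String` standing in for the `Symbol` owner type:

```scala
// The save/restore idiom behind Traverser.atOwner: swap the owner in
// for the duration of a traversal step, then restore the previous one.
// A plain String stands in for the Symbol owner type.
object AtOwnerDemo {
  var currentOwner: String = "<root>"
  def atOwner(owner: String)(body: => Unit): Unit = {
    val prevOwner = currentOwner
    currentOwner = owner
    body
    currentOwner = prevOwner
  }
}
```

Note that, like the original, this sketch does not use try/finally, so the owner is not restored if `body` throws.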
+
+ /** A synthetic term holding an arbitrary type. Not to be confused
+ * with TypTree, the trait for trees that represent types.
+ * TypeTrees are inserted in several places, but most notably in
+ * <code>RefCheck</code>, where the arbitrary type trees are all replaced by
+ * TypeTrees. */
+ abstract class AbsTypeTree extends TypTree {
+ override def symbol = if (tpe == null) null else tpe.typeSymbol
+ override def isEmpty = (tpe eq null) || tpe == NoType
+ }
+
+ /** A tree that has an annotation attached to it. Only used for annotated types and
+ * annotation ascriptions, annotations on definitions are stored in the Modifiers.
+ * Eliminated by typechecker (typedAnnotated), the annotations are then stored in
+ * an AnnotatedType.
+ */
+ case class Annotated(annot: Tree, arg: Tree) extends Tree
+
+ /** Singleton type, eliminated by RefCheck */
+ case class SingletonTypeTree(ref: Tree)
+ extends TypTree
+
+ /** Type selection <qualifier> # <name>, eliminated by RefCheck */
+ case class SelectFromTypeTree(qualifier: Tree, name: Name)
+ extends TypTree with RefTree
+
+ /** Intersection type <parent1> with ... with <parentN> { <decls> }, eliminated by RefCheck */
+ case class CompoundTypeTree(templ: Template)
+ extends TypTree
+
+ /** Applied type <tpt> [ <args> ], eliminated by RefCheck */
+ case class AppliedTypeTree(tpt: Tree, args: List[Tree])
+ extends TypTree {
+ override def symbol: Symbol = tpt.symbol
+ override def symbol_=(sym: Symbol) { tpt.symbol = sym }
+ }
+
+ case class TypeBoundsTree(lo: Tree, hi: Tree)
+ extends TypTree
+
+ case class ExistentialTypeTree(tpt: Tree, whereClauses: List[Tree])
+ extends TypTree
+
+ /** Array selection <qualifier> . <name> only used during erasure */
+ case class SelectFromArray(qualifier: Tree, name: Name, erasure: Type)
+ extends TermTree with RefTree { }
+
+/* A standard pattern match
+ case EmptyTree =>
+ case PackageDef(pid, stats) =>
+ // package pid { stats }
+ case ClassDef(mods, name, tparams, impl) =>
+ // mods class name [tparams] impl where impl = extends parents { defs }
+ case ModuleDef(mods, name, impl) => (eliminated by refcheck)
+ // mods object name impl where impl = extends parents { defs }
+ case ValDef(mods, name, tpt, rhs) =>
+ // mods val name: tpt = rhs
+ // note missing type information is expressed by tpt = TypeTree()
+ case DefDef(mods, name, tparams, vparamss, tpt, rhs) =>
+ // mods def name[tparams](vparams_1)...(vparams_n): tpt = rhs
+ // note missing type information is expressed by tpt = TypeTree()
+ case TypeDef(mods, name, tparams, rhs) => (eliminated by erasure)
+ // mods type name[tparams] = rhs
+ // mods type name[tparams] >: lo <: hi, where lo, hi are in a TypeBoundsTree,
+ // and DEFERRED is set in mods
+ case LabelDef(name, params, rhs) =>
+ // used for tailcalls and the like
+ // while/do are desugared to label defs as follows:
+ // while (cond) body ==> LabelDef(L$, List(), if (cond) { body; L$() } else ())
+ // do body while (cond) ==> LabelDef(L$, List(), body; if (cond) L$() else ())
+ case Import(expr, selectors) => (eliminated by typecheck)
+ // import expr.{selectors}
+ // Selectors are a list of pairs of names (from, to).
+ // The last (and maybe only) name may be a nme.WILDCARD
+ // for instance
+ // import qual.{x, y => z, _} would be represented as
+ // Import(qual, List(("x", "x"), ("y", "z"), (WILDCARD, null)))
+ case Template(parents, self, body) =>
+ // extends parents { self => body }
+ // if self is missing it is represented as emptyValDef
+ case Block(stats, expr) =>
+ // { stats; expr }
+ case CaseDef(pat, guard, body) => (eliminated by transmatch/explicitouter)
+ // case pat if guard => body
+ case Alternative(trees) => (eliminated by transmatch/explicitouter)
+ // pat1 | ... | patn
+ case Star(elem) => (eliminated by transmatch/explicitouter)
+ // pat*
+ case Bind(name, body) => (eliminated by transmatch/explicitouter)
+ // name @ pat
+ case UnApply(fun: Tree, args) (introduced by typer, eliminated by transmatch/explicitouter)
+ // used for unapply methods
+ case ArrayValue(elemtpt, trees) => (introduced by uncurry)
+ // used to pass arguments to vararg arguments
+ // for instance, printf("%s%d", foo, 42) is translated after uncurry to:
+ // Apply(
+ // Ident("printf"),
+ // Literal("%s%d"),
+ // ArrayValue(<Any>, List(Ident("foo"), Literal(42))))
+ case Function(vparams, body) => (eliminated by lambdaLift)
+ // vparams => body where vparams:List[ValDef]
+ case Assign(lhs, rhs) =>
+ // lhs = rhs
+ case If(cond, thenp, elsep) =>
+ // if (cond) thenp else elsep
+ case Match(selector, cases) =>
+ // selector match { cases }
+ case Return(expr) =>
+ // return expr
+ case Try(block, catches, finalizer) =>
+ // try block catch { catches } finally finalizer where catches: List[CaseDef]
+ case Throw(expr) =>
+ // throw expr
+ case New(tpt) =>
+ // new tpt, always in the context: (new tpt).<init>[targs](args)
+ case Typed(expr, tpt) => (eliminated by erasure)
+ // expr: tpt
+ case TypeApply(fun, args) =>
+ // fun[args]
+ case Apply(fun, args) =>
+ // fun(args)
+ // for instance fun[targs](args) is expressed as Apply(TypeApply(fun, targs), args)
+ case ApplyDynamic(qual, args) (introduced by erasure, eliminated by cleanup)
+ // fun(args)
+ case Super(qual, mix) =>
+ // qual.super[mix]; if qual and/or mix is empty, they are nme.EMPTY.toTypeName
+ case This(qual) =>
+ // qual.this
+ case Select(qualifier, selector) =>
+ // qualifier.selector
+ case Ident(name) =>
+ // name
+ // note: type checker converts idents that refer to enclosing fields or methods
+ // to selects; name ==> this.name
+ case Literal(value) =>
+ // value
+ case TypeTree() => (introduced by refcheck)
+ // a type that's not written out, but given in the tpe attribute
+ case Annotated(annot, arg) => (eliminated by typer)
+ // arg @annot for types, arg: @annot for exprs
+ case SingletonTypeTree(ref) => (eliminated by uncurry)
+ // ref.type
+ case SelectFromTypeTree(qualifier, selector) => (eliminated by uncurry)
+ // qualifier # selector, a path-dependent type p.T is expressed as p.type # T
+ case CompoundTypeTree(templ: Template) => (eliminated by uncurry)
+ // parent1 with ... with parentN { refinement }
+ case AppliedTypeTree(tpt, args) => (eliminated by uncurry)
+ // tpt[args]
+ case TypeBoundsTree(lo, hi) => (eliminated by uncurry)
+ // >: lo <: hi
+ case ExistentialTypeTree(tpt, whereClauses) => (eliminated by uncurry)
+ // tpt forSome { whereClauses }
+
+*/
+}
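
[Editorial aside, not part of the patch: the while/do desugaring described in the pattern-match comment above can be sketched in plain Scala. A `while (cond) body` loop becomes a recursive nullary label method, here written as an ordinary local def; `cond` and `body` are illustrative by-name parameters, and the names are hypothetical.]

```scala
// Sketch of the LabelDef desugaring for while-loops:
//   while (cond) body  ==>  LabelDef(L$, List(), if (cond) { body; L$() } else ())
object WhileDesugar {
  def whileLoop(cond: => Boolean)(body: => Unit): Unit = {
    // `loop` plays the role of the label L$; calling it is the back-jump.
    def loop(): Unit = if (cond) { body; loop() } else ()
    loop()
  }
}
```

The tail-recursive call to `loop()` corresponds to the `L$()` application in the tree; the tailcalls phase turns it back into a jump.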
diff --git a/src/library/scala/reflect/generic/Types.scala b/src/library/scala/reflect/generic/Types.scala
new file mode 100755
index 0000000000..17e19715d7
--- /dev/null
+++ b/src/library/scala/reflect/generic/Types.scala
@@ -0,0 +1,156 @@
+package scala.reflect
+package generic
+
+trait Types { self: Universe =>
+
+ abstract class AbsType {
+ def typeSymbol: Symbol
+ def decl(name: Name): Symbol
+
+ /** Is this type completed (i.e. not a lazy type)?
+ */
+ def isComplete: Boolean = true
+
+ /** If this is a lazy type, assign a new type to `sym'. */
+ def complete(sym: Symbol) {}
+
+ /** Convert to a String, avoiding infinite recursion by cutting off
+ * after `maxTostringRecursions` recursion levels. Uses `safeToString`
+ * to produce a string at each level.
+ */
+ override def toString: String =
+ if (tostringRecursions >= maxTostringRecursions)
+ "..."
+ else
+ try {
+ tostringRecursions += 1
+ safeToString
+ } finally {
+ tostringRecursions -= 1
+ }
+
+ /** Method to be implemented in subclasses.
+ * Converts this type to a string by calling toString for its parts.
+ */
+ def safeToString: String = super.toString
+ }
+
+ type Type >: Null <: AbsType
+
+ val NoType: Type
+ val NoPrefix: Type
+
+ type ThisType <: Type
+ val ThisType: ThisTypeExtractor
+
+ type TypeRef <: Type
+ val TypeRef: TypeRefExtractor
+
+ type SingleType <: Type
+ val SingleType: SingleTypeExtractor
+
+ type SuperType <: Type
+ val SuperType: SuperTypeExtractor
+
+ type TypeBounds <: Type
+ val TypeBounds: TypeBoundsExtractor
+
+ type CompoundType <: Type
+
+ type RefinedType <: CompoundType
+ val RefinedType: RefinedTypeExtractor
+
+ type ClassInfoType <: CompoundType
+ val ClassInfoType: ClassInfoTypeExtractor
+
+ type ConstantType <: Type
+ val ConstantType: ConstantTypeExtractor
+
+ type MethodType <: Type
+ val MethodType: MethodTypeExtractor
+
+ type PolyType <: Type
+ val PolyType: PolyTypeExtractor
+
+ type ExistentialType <: Type
+ val ExistentialType: ExistentialTypeExtractor
+
+ type AnnotatedType <: Type
+ val AnnotatedType: AnnotatedTypeExtractor
+
+ type LazyType <: Type with AbsLazyType
+
+ trait AbsLazyType extends AbsType {
+ override def isComplete: Boolean = false
+ override def complete(sym: Symbol)
+ override def safeToString = "<?>"
+ }
+
+ abstract class ThisTypeExtractor {
+ def apply(sym: Symbol): Type
+ def unapply(tpe: ThisType): Option[Symbol]
+ }
+
+ abstract class SingleTypeExtractor {
+ def apply(pre: Type, sym: Symbol): Type
+ def unapply(tpe: SingleType): Option[(Type, Symbol)]
+ }
+
+ abstract class SuperTypeExtractor {
+ def apply(thistpe: Type, supertpe: Type): Type
+ def unapply(tpe: SuperType): Option[(Type, Type)]
+ }
+
+ abstract class TypeRefExtractor {
+ def apply(pre: Type, sym: Symbol, args: List[Type]): Type
+ def unapply(tpe: TypeRef): Option[(Type, Symbol, List[Type])]
+ }
+
+ abstract class TypeBoundsExtractor {
+ def apply(lo: Type, hi: Type): TypeBounds
+ def unapply(tpe: TypeBounds): Option[(Type, Type)]
+ }
+
+ abstract class RefinedTypeExtractor {
+ def apply(parents: List[Type], decls: Scope): RefinedType
+ def apply(parents: List[Type], decls: Scope, clazz: Symbol): RefinedType
+ def unapply(tpe: RefinedType): Option[(List[Type], Scope)]
+ }
+
+ abstract class ClassInfoTypeExtractor {
+ def apply(parents: List[Type], decls: Scope, clazz: Symbol): ClassInfoType
+ def unapply(tpe: ClassInfoType): Option[(List[Type], Scope, Symbol)]
+ }
+
+ abstract class ConstantTypeExtractor {
+ def apply(value: Constant): ConstantType
+ def unapply(tpe: ConstantType): Option[Constant]
+ }
+
+ abstract class MethodTypeExtractor {
+ def apply(params: List[Symbol], resultType: Type): MethodType
+ def unapply(tpe: MethodType): Option[(List[Symbol], Type)]
+ }
+
+ abstract class PolyTypeExtractor {
+ def apply(typeParams: List[Symbol], resultType: Type): PolyType
+ def unapply(tpe: PolyType): Option[(List[Symbol], Type)]
+ }
+
+ abstract class ExistentialTypeExtractor {
+ def apply(quantified: List[Symbol], underlying: Type): ExistentialType
+ def unapply(tpe: ExistentialType): Option[(List[Symbol], Type)]
+ }
+
+ abstract class AnnotatedTypeExtractor {
+ def apply(annotations: List[AnnotationInfo], underlying: Type, selfsym: Symbol): AnnotatedType
+ def unapply(tpe: AnnotatedType): Option[(List[AnnotationInfo], Type, Symbol)]
+ }
+
+ /** The maximum number of recursions allowed in toString
+ */
+ final val maxTostringRecursions = 50
+
+ private var tostringRecursions = 0
+}
+
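
[Editorial aside, not part of the patch: the recursion guard used by `AbsType.toString` above is a small reusable pattern — a depth counter incremented in `try` and decremented in `finally`, with a cutoff string once the limit is reached. The sketch below shows it standalone; the object and method names are hypothetical.]

```scala
// Depth-limited rendering, as in AbsType.toString/safeToString.
object ToStringGuard {
  final val maxRecursions = 50
  private var recursions = 0

  def guarded(render: => String): String =
    if (recursions >= maxRecursions) "..."          // cut off runaway recursion
    else
      try { recursions += 1; render }               // render one level deeper
      finally { recursions -= 1 }                   // always restore the depth
}
```

Because the counter is restored in `finally`, the guard stays correct even when `render` throws.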
diff --git a/src/library/scala/reflect/generic/UnPickler.scala b/src/library/scala/reflect/generic/UnPickler.scala
new file mode 100755
index 0000000000..d7eef770cc
--- /dev/null
+++ b/src/library/scala/reflect/generic/UnPickler.scala
@@ -0,0 +1,775 @@
+/* NSC -- new Scala compiler
+ * Copyright 2005-2010 LAMP/EPFL
+ * @author Martin Odersky
+ */
+// $Id: UnPickler.scala 20716 2010-01-28 14:14:20Z rytz $
+
+package scala.reflect
+package generic
+
+import java.io.IOException
+import java.lang.{Float, Double}
+
+import Flags._
+import PickleFormat._
+import collection.mutable.{HashMap, ListBuffer}
+import annotation.switch
+
+/** @author Martin Odersky
+ * @version 1.0
+ */
+abstract class UnPickler {
+
+ val global: Universe
+ import global._
+
+ /** Unpickle symbol table information descending from a class and/or module root
+ * from an array of bytes.
+ * @param bytes the byte array from which we unpickle
+ * @param offset the offset at which unpickling starts
+ * @param classRoot the top-level class which is unpickled, or NoSymbol if inapplicable
+ * @param moduleRoot the top-level module which is unpickled, or NoSymbol if inapplicable
+ * @param filename the filename associated with the byte array, only used for error messages
+ */
+ def unpickle(bytes: Array[Byte], offset: Int, classRoot: Symbol, moduleRoot: Symbol, filename: String) {
+ try {
+ scan(bytes, offset, classRoot, moduleRoot, filename)
+ } catch {
+ case ex: IOException =>
+ throw ex
+ case ex: Throwable =>
+ /*if (settings.debug.value)*/ ex.printStackTrace()
+ throw new RuntimeException("error reading Scala signature of "+filename+": "+ex.getMessage())
+ }
+ }
+
+ /** To be implemented in subclasses. Like `unpickle` but without the catch-all error handling.
+ */
+ def scan(bytes: Array[Byte], offset: Int, classRoot: Symbol, moduleRoot: Symbol, filename: String)
+
+ abstract class Scan(bytes: Array[Byte], offset: Int, classRoot: Symbol, moduleRoot: Symbol, filename: String) extends PickleBuffer(bytes, offset, -1) {
+ //println("unpickle " + classRoot + " and " + moduleRoot)//debug
+
+ protected def debug = false
+
+ checkVersion()
+
+ /** A map from entry numbers to array offsets */
+ private val index = createIndex
+
+ /** A map from entry numbers to symbols, types, or annotations */
+ private val entries = new Array[AnyRef](index.length)
+
+ /** A map from symbols to their associated `decls' scopes */
+ private val symScopes = new HashMap[Symbol, Scope]
+
+ //println("unpickled " + classRoot + ":" + classRoot.rawInfo + ", " + moduleRoot + ":" + moduleRoot.rawInfo);//debug
+
+ def run() {
+ for (i <- 0 until index.length) {
+ if (isSymbolEntry(i))
+ at(i, readSymbol)
+ else if (isSymbolAnnotationEntry(i))
+ at(i, {() => readSymbolAnnotation(); null})
+ else if (isChildrenEntry(i))
+ at(i, {() => readChildren(); null})
+ }
+ }
+
+ private def checkVersion() {
+ val major = readNat()
+ val minor = readNat()
+ if (major != MajorVersion || minor > MinorVersion)
+ throw new IOException("Scala signature " + classRoot.decodedName +
+ " has wrong version\n expected: " +
+ MajorVersion + "." + MinorVersion +
+ "\n found: " + major + "." + minor +
+ " in "+filename)
+ }
+
+ /** The `decls' scope associated with given symbol */
+ protected def symScope(sym: Symbol) = symScopes.get(sym) match {
+ case None => val s = newScope; symScopes(sym) = s; s
+ case Some(s) => s
+ }
+
+ /** Does entry represent an (internal) symbol */
+ protected def isSymbolEntry(i: Int): Boolean = {
+ val tag = bytes(index(i)).toInt
+ (firstSymTag <= tag && tag <= lastSymTag &&
+ (tag != CLASSsym || !isRefinementSymbolEntry(i)))
+ }
+
+ /** Does entry represent an (internal or external) symbol */
+ protected def isSymbolRef(i: Int): Boolean = {
+ val tag = bytes(index(i))
+ (firstSymTag <= tag && tag <= lastExtSymTag)
+ }
+
+ /** Does entry represent a name? */
+ protected def isNameEntry(i: Int): Boolean = {
+ val tag = bytes(index(i)).toInt
+ tag == TERMname || tag == TYPEname
+ }
+
+ /** Does entry represent a symbol annotation? */
+ protected def isSymbolAnnotationEntry(i: Int): Boolean = {
+ val tag = bytes(index(i)).toInt
+ tag == SYMANNOT
+ }
+
+ /** Does the entry represent children of a symbol? */
+ protected def isChildrenEntry(i: Int): Boolean = {
+ val tag = bytes(index(i)).toInt
+ tag == CHILDREN
+ }
+
+ /** Does entry represent a refinement symbol?
+ * pre: Entry is a class symbol
+ */
+ protected def isRefinementSymbolEntry(i: Int): Boolean = {
+ val savedIndex = readIndex
+ readIndex = index(i)
+ val tag = readByte().toInt
+ assert(tag == CLASSsym)
+
+ readNat(); // read length
+ val result = readNameRef() == mkTypeName(nme.REFINE_CLASS_NAME)
+ readIndex = savedIndex
+ result
+ }
+
+ /** If entry at <code>i</code> is undefined, define it by performing
+ * operation <code>op</code> with <code>readIndex</code> at the start of the i'th
+ * entry. Restore <code>readIndex</code> afterwards.
+ */
+ protected def at[T <: AnyRef](i: Int, op: () => T): T = {
+ var r = entries(i)
+ if (r eq null) {
+ val savedIndex = readIndex
+ readIndex = index(i)
+ r = op()
+ assert(entries(i) eq null, entries(i))
+ entries(i) = r
+ readIndex = savedIndex
+ }
+ r.asInstanceOf[T]
+ }
+
+ /** Read a name */
+ protected def readName(): Name = {
+ val tag = readByte()
+ val len = readNat()
+ tag match {
+ case TERMname => newTermName(bytes, readIndex, len)
+ case TYPEname => newTypeName(bytes, readIndex, len)
+ case _ => errorBadSignature("bad name tag: " + tag)
+ }
+ }
+
+ /** Read a symbol */
+ protected def readSymbol(): Symbol = {
+ val tag = readByte()
+ val end = readNat() + readIndex
+ var sym: Symbol = NoSymbol
+ tag match {
+ case EXTref | EXTMODCLASSref =>
+ val name = readNameRef()
+ val owner = if (readIndex == end) definitions.RootClass else readSymbolRef()
+ def fromName(name: Name) =
+ if (mkTermName(name) == nme.ROOT) definitions.RootClass
+ else if (name == nme.ROOTPKG) definitions.RootPackage
+ else if (tag == EXTref) owner.info.decl(name)
+ else owner.info.decl(name).moduleClass
+ sym = fromName(name)
+ // If sym not found try with expanded name.
+ // This can happen if references to private symbols are
+ // read from outside; for instance when checking the children of a class
+ // (see t1722)
+ if (sym == NoSymbol) sym = fromName(nme.expandedName(name, owner))
+
+ // If the owner is overloaded (i.e. a method), it's not possible to select the
+ // right member => return NoSymbol. This can only happen when unpickling a tree.
+ // The "case Apply" in readTree() takes care of selecting the correct alternative
+ // after parsing the arguments.
+ if (sym == NoSymbol && !owner.isOverloaded) errorMissingRequirement(name, owner)
+
+ case NONEsym =>
+ sym = NoSymbol
+
+ case _ => // symbols that were pickled with Pickler.writeSymInfo
+ var defaultGetter: Symbol = NoSymbol // @deprecated, to be removed for 2.8 final
+ var nameref = readNat()
+ if (tag == VALsym && isSymbolRef(nameref)) { // @deprecated, to be removed for 2.8 final
+ defaultGetter = at(nameref, readSymbol)
+ nameref = readNat()
+ }
+ val name = at(nameref, readName)
+ val owner = readSymbolRef()
+ val flags = pickledToRawFlags(readLongNat())
+ var privateWithin: Symbol = NoSymbol
+ var inforef = readNat()
+ if (isSymbolRef(inforef)) {
+ privateWithin = at(inforef, readSymbol)
+ inforef = readNat()
+ }
+ tag match {
+ case TYPEsym =>
+ sym = owner.newAbstractType(name)
+ case ALIASsym =>
+ sym = owner.newAliasType(name)
+ case CLASSsym =>
+ sym =
+ if (name == classRoot.name && owner == classRoot.owner)
+ (if ((flags & MODULE) != 0L) moduleRoot.moduleClass
+ else classRoot)
+ else
+ if ((flags & MODULE) != 0L) owner.newModuleClass(name)
+ else owner.newClass(name)
+ if (readIndex != end) sym.typeOfThis = newLazyTypeRef(readNat())
+ case MODULEsym =>
+ val clazz = at(inforef, readType).typeSymbol
+ sym =
+ if (name == moduleRoot.name && owner == moduleRoot.owner) moduleRoot
+ else {
+ val m = owner.newModule(name, clazz)
+ clazz.sourceModule = m
+ m
+ }
+ case VALsym =>
+ sym = if (name == moduleRoot.name && owner == moduleRoot.owner) { assert(false); NoSymbol }
+ else if ((flags & METHOD) != 0) owner.newMethod(name)
+ else owner.newValue(name)
+ case _ =>
+ noSuchSymbolTag(tag, end, name, owner)
+ }
+ sym.flags = flags & PickledFlags
+ sym.privateWithin = privateWithin
+ if (readIndex != end) assert(sym hasFlag (SUPERACCESSOR | PARAMACCESSOR), sym)
+ if (sym hasFlag SUPERACCESSOR) assert(readIndex != end)
+ sym.info =
+ if (readIndex != end) newLazyTypeRefAndAlias(inforef, readNat())
+ else newLazyTypeRef(inforef)
+ if (sym.owner.isClass && sym != classRoot && sym != moduleRoot &&
+ !sym.isModuleClass && !sym.isRefinementClass && !sym.isTypeParameter && !sym.isExistentiallyBound)
+ symScope(sym.owner) enter sym
+ }
+ sym
+ }
+
+ def noSuchSymbolTag(tag: Int, end: Int, name: Name, owner: Symbol) =
+ errorBadSignature("bad symbol tag: " + tag)
+
+ /** Read a type */
+ protected def readType(): Type = {
+ val tag = readByte()
+ val end = readNat() + readIndex
+ (tag: @switch) match {
+ case NOtpe =>
+ NoType
+ case NOPREFIXtpe =>
+ NoPrefix
+ case THIStpe =>
+ ThisType(readSymbolRef())
+ case SINGLEtpe =>
+ SingleType(readTypeRef(), readSymbolRef()) // !!! was singleType
+ case SUPERtpe =>
+ val thistpe = readTypeRef()
+ val supertpe = readTypeRef()
+ SuperType(thistpe, supertpe)
+ case CONSTANTtpe =>
+ ConstantType(readConstantRef())
+ case TYPEREFtpe =>
+ val pre = readTypeRef()
+ val sym = readSymbolRef()
+ var args = until(end, readTypeRef)
+ TypeRef(pre, sym, args)
+ case TYPEBOUNDStpe =>
+ TypeBounds(readTypeRef(), readTypeRef())
+ case REFINEDtpe =>
+ val clazz = readSymbolRef()
+ RefinedType(until(end, readTypeRef), symScope(clazz), clazz)
+ case CLASSINFOtpe =>
+ val clazz = readSymbolRef()
+ ClassInfoType(until(end, readTypeRef), symScope(clazz), clazz)
+ case METHODtpe | IMPLICITMETHODtpe =>
+ val restpe = readTypeRef()
+ val params = until(end, readSymbolRef)
+ // if the method is overloaded, the params cannot be determined (see readSymbol) => return NoType.
+ // This can only happen for trees; the "case Apply" in readTree() takes care of selecting the correct
+ // alternative after parsing the arguments.
+ if (params.contains(NoSymbol) || restpe == NoType) NoType
+ else MethodType(params, restpe)
+ case POLYtpe =>
+ val restpe = readTypeRef()
+ val typeParams = until(end, readSymbolRef)
+ PolyType(typeParams, restpe)
+ case EXISTENTIALtpe =>
+ val restpe = readTypeRef()
+ ExistentialType(until(end, readSymbolRef), restpe)
+ case ANNOTATEDtpe =>
+ var typeRef = readNat()
+ val selfsym = if (isSymbolRef(typeRef)) {
+ val s = at(typeRef, readSymbol)
+ typeRef = readNat()
+ s
+ } else NoSymbol // selfsym can go.
+ val tp = at(typeRef, readType)
+ val annots = until(end, readAnnotationRef)
+ if (selfsym == NoSymbol) AnnotatedType(annots, tp, selfsym)
+ else tp
+ case _ =>
+ noSuchTypeTag(tag, end)
+ }
+ }
+
+ def noSuchTypeTag(tag: Int, end: Int): Type =
+ errorBadSignature("bad type tag: " + tag)
+
+ /** Read a constant */
+ protected def readConstant(): Constant = {
+ val tag = readByte().toInt
+ val len = readNat()
+ (tag: @switch) match {
+ case LITERALunit => Constant(())
+ case LITERALboolean => Constant(readLong(len) != 0L)
+ case LITERALbyte => Constant(readLong(len).toByte)
+ case LITERALshort => Constant(readLong(len).toShort)
+ case LITERALchar => Constant(readLong(len).toChar)
+ case LITERALint => Constant(readLong(len).toInt)
+ case LITERALlong => Constant(readLong(len))
+ case LITERALfloat => Constant(Float.intBitsToFloat(readLong(len).toInt))
+ case LITERALdouble => Constant(Double.longBitsToDouble(readLong(len)))
+ case LITERALstring => Constant(readNameRef().toString())
+ case LITERALnull => Constant(null)
+ case LITERALclass => Constant(readTypeRef())
+ case LITERALenum => Constant(readSymbolRef())
+ case _ => noSuchConstantTag(tag, len)
+ }
+ }
+
+ def noSuchConstantTag(tag: Int, len: Int): Constant =
+ errorBadSignature("bad constant tag: " + tag)
+
+ /** Read children and store them into the corresponding symbol.
+ */
+ protected def readChildren() {
+ val tag = readByte()
+ assert(tag == CHILDREN)
+ val end = readNat() + readIndex
+ val target = readSymbolRef()
+ while (readIndex != end) target addChild readSymbolRef()
+ }
+
+ /** Read an annotation argument, which is pickled either
+ * as a Constant or a Tree.
+ */
+ protected def readAnnotArg(i: Int): Tree = {
+ if (bytes(index(i)) == TREE) {
+ at(i, readTree)
+ } else {
+ val const = at(i, readConstant)
+ global.Literal(const).setType(const.tpe)
+ }
+ }
+
+ /** Read a ClassfileAnnotArg (argument to a classfile annotation)
+ */
+ protected def readClassfileAnnotArg(i: Int): ClassfileAnnotArg = bytes(index(i)) match {
+ case ANNOTINFO =>
+ NestedAnnotArg(at(i, readAnnotation))
+ case ANNOTARGARRAY =>
+ at(i, () => {
+ readByte() // skip the `annotargarray` tag
+ val end = readNat() + readIndex
+ ArrayAnnotArg(until(end, () => readClassfileAnnotArg(readNat())).toArray(classfileAnnotArgManifest))
+ })
+ case _ =>
+ LiteralAnnotArg(at(i, readConstant))
+ }
+
+ /** Read an AnnotationInfo. Not to be called directly, use
+ * readAnnotation or readSymbolAnnotation
+ */
+ protected def readAnnotationInfo(end: Int): AnnotationInfo = {
+ val atp = readTypeRef()
+ val args = new ListBuffer[Tree]
+ val assocs = new ListBuffer[(Name, ClassfileAnnotArg)]
+ while (readIndex != end) {
+ val argref = readNat()
+ if (isNameEntry(argref)) {
+ val name = at(argref, readName)
+ val arg = readClassfileAnnotArg(readNat())
+ assocs += ((name, arg))
+ }
+ else
+ args += readAnnotArg(argref)
+ }
+ AnnotationInfo(atp, args.toList, assocs.toList)
+ }
+
+ /** Read an annotation and, as a side effect, attach it to
+ * the symbol it references. Called at top level, for all
+ * (symbol, annotInfo) entries. */
+ protected def readSymbolAnnotation() {
+ val tag = readByte()
+ if (tag != SYMANNOT)
+ errorBadSignature("symbol annotation expected ("+ tag +")")
+ val end = readNat() + readIndex
+ val target = readSymbolRef()
+ target.addAnnotation(readAnnotationInfo(end))
+ }
+
+ /** Read an annotation and return it. Used when unpickling
+ * an ANNOTATED(WSELF)tpe or a NestedAnnotArg */
+ protected def readAnnotation(): AnnotationInfo = {
+ val tag = readByte()
+ if (tag != ANNOTINFO)
+ errorBadSignature("annotation expected (" + tag + ")")
+ val end = readNat() + readIndex
+ readAnnotationInfo(end)
+ }
+
+ /* Read an abstract syntax tree */
+ protected def readTree(): Tree = {
+ val outerTag = readByte()
+ if (outerTag != TREE)
+ errorBadSignature("tree expected (" + outerTag + ")")
+ val end = readNat() + readIndex
+ val tag = readByte()
+ val tpe = if (tag == EMPTYtree) NoType else readTypeRef()
+
+ // Set by the three functions to follow. If symbol is non-null
+ // after the new tree 't' has been created, t has its Symbol
+ // set to symbol; and it always has its Type set to tpe.
+ var symbol: Symbol = null
+ var mods: Modifiers = null
+ var name: Name = null
+
+ /** Read a Symbol, Modifiers, and a Name */
+ def setSymModsName() {
+ symbol = readSymbolRef()
+ mods = readModifiersRef()
+ name = readNameRef()
+ }
+ /** Read a Symbol and a Name */
+ def setSymName() {
+ symbol = readSymbolRef()
+ name = readNameRef()
+ }
+ /** Read a Symbol */
+ def setSym() {
+ symbol = readSymbolRef()
+ }
+
+ val t = tag match {
+ case EMPTYtree =>
+ EmptyTree
+
+ case PACKAGEtree =>
+ setSym()
+ // val discardedSymbol = readSymbolRef() // XXX is symbol intentionally not set?
+ val pid = readTreeRef().asInstanceOf[RefTree]
+ val stats = until(end, readTreeRef)
+ PackageDef(pid, stats)
+
+ case CLASStree =>
+ setSymModsName()
+ val impl = readTemplateRef()
+ val tparams = until(end, readTypeDefRef)
+ ClassDef(mods, name, tparams, impl)
+
+ case MODULEtree =>
+ setSymModsName()
+ ModuleDef(mods, name, readTemplateRef())
+
+ case VALDEFtree =>
+ setSymModsName()
+ val tpt = readTreeRef()
+ val rhs = readTreeRef()
+ ValDef(mods, name, tpt, rhs)
+
+ case DEFDEFtree =>
+ setSymModsName()
+ val tparams = times(readNat(), readTypeDefRef)
+ val vparamss = times(readNat(), () => times(readNat(), readValDefRef))
+ val tpt = readTreeRef()
+ val rhs = readTreeRef()
+
+ DefDef(mods, name, tparams, vparamss, tpt, rhs)
+
+ case TYPEDEFtree =>
+ setSymModsName()
+ val rhs = readTreeRef()
+ val tparams = until(end, readTypeDefRef)
+ TypeDef(mods, name, tparams, rhs)
+
+ case LABELtree =>
+ setSymName()
+ val rhs = readTreeRef()
+ val params = until(end, readIdentRef)
+ LabelDef(name, params, rhs)
+
+ case IMPORTtree =>
+ setSym()
+ val expr = readTreeRef()
+ val selectors = until(end, () => {
+ val from = readNameRef()
+ val to = readNameRef()
+ ImportSelector(from, -1, to, -1)
+ })
+
+ Import(expr, selectors)
+
+ case TEMPLATEtree =>
+ setSym()
+ val parents = times(readNat(), readTreeRef)
+ val self = readValDefRef()
+ val body = until(end, readTreeRef)
+
+ Template(parents, self, body)
+
+ case BLOCKtree =>
+ val expr = readTreeRef()
+ val stats = until(end, readTreeRef)
+ Block(stats, expr)
+
+ case CASEtree =>
+ val pat = readTreeRef()
+ val guard = readTreeRef()
+ val body = readTreeRef()
+ CaseDef(pat, guard, body)
+
+ case ALTERNATIVEtree =>
+ Alternative(until(end, readTreeRef))
+
+ case STARtree =>
+ Star(readTreeRef())
+
+ case BINDtree =>
+ setSymName()
+ Bind(name, readTreeRef())
+
+ case UNAPPLYtree =>
+ val fun = readTreeRef()
+ val args = until(end, readTreeRef)
+ UnApply(fun, args)
+
+ case ARRAYVALUEtree =>
+ val elemtpt = readTreeRef()
+ val trees = until(end, readTreeRef)
+ ArrayValue(elemtpt, trees)
+
+ case FUNCTIONtree =>
+ setSym()
+ val body = readTreeRef()
+ val vparams = until(end, readValDefRef)
+ Function(vparams, body)
+
+ case ASSIGNtree =>
+ val lhs = readTreeRef()
+ val rhs = readTreeRef()
+ Assign(lhs, rhs)
+
+ case IFtree =>
+ val cond = readTreeRef()
+ val thenp = readTreeRef()
+ val elsep = readTreeRef()
+ If(cond, thenp, elsep)
+
+ case MATCHtree =>
+ val selector = readTreeRef()
+ val cases = until(end, readCaseDefRef)
+ Match(selector, cases)
+
+ case RETURNtree =>
+ setSym()
+ Return(readTreeRef())
+
+ case TREtree =>
+ val block = readTreeRef()
+ val finalizer = readTreeRef()
+ val catches = until(end, readCaseDefRef)
+ Try(block, catches, finalizer)
+
+ case THROWtree =>
+ Throw(readTreeRef())
+
+ case NEWtree =>
+ New(readTreeRef())
+
+ case TYPEDtree =>
+ val expr = readTreeRef()
+ val tpt = readTreeRef()
+ Typed(expr, tpt)
+
+ case TYPEAPPLYtree =>
+ val fun = readTreeRef()
+ val args = until(end, readTreeRef)
+ TypeApply(fun, args)
+
+ case APPLYtree =>
+ val fun = readTreeRef()
+ val args = until(end, readTreeRef)
+ if (fun.symbol.isOverloaded) {
+ fun.setType(fun.symbol.info)
+ inferMethodAlternative(fun, args map (_.tpe), tpe)
+ }
+ Apply(fun, args)
+
+ case APPLYDYNAMICtree =>
+ setSym()
+ val qual = readTreeRef()
+ val args = until(end, readTreeRef)
+ ApplyDynamic(qual, args)
+
+ case SUPERtree =>
+ setSym()
+ val qual = readNameRef()
+ val mix = readNameRef()
+ Super(qual, mix)
+
+ case THIStree =>
+ setSym()
+ This(readNameRef())
+
+ case SELECTtree =>
+ setSym()
+ val qualifier = readTreeRef()
+ val selector = readNameRef()
+ Select(qualifier, selector)
+
+ case IDENTtree =>
+ setSymName()
+ Ident(name)
+
+ case LITERALtree =>
+ global.Literal(readConstantRef())
+
+ case TYPEtree =>
+ TypeTree()
+
+ case ANNOTATEDtree =>
+ val annot = readTreeRef()
+ val arg = readTreeRef()
+ Annotated(annot, arg)
+
+ case SINGLETONTYPEtree =>
+ SingletonTypeTree(readTreeRef())
+
+ case SELECTFROMTYPEtree =>
+ val qualifier = readTreeRef()
+ val selector = readNameRef()
+ SelectFromTypeTree(qualifier, selector)
+
+ case COMPOUNDTYPEtree =>
+ CompoundTypeTree(readTemplateRef())
+
+ case APPLIEDTYPEtree =>
+ val tpt = readTreeRef()
+ val args = until(end, readTreeRef)
+ AppliedTypeTree(tpt, args)
+
+ case TYPEBOUNDStree =>
+ val lo = readTreeRef()
+ val hi = readTreeRef()
+ TypeBoundsTree(lo, hi)
+
+ case EXISTENTIALTYPEtree =>
+ val tpt = readTreeRef()
+ val whereClauses = until(end, readTreeRef)
+ ExistentialTypeTree(tpt, whereClauses)
+
+ case _ =>
+ noSuchTreeTag(tag, end)
+ }
+
+ if (symbol == null) t setType tpe
+ else t setSymbol symbol setType tpe
+ }
+
+ def noSuchTreeTag(tag: Int, end: Int) =
+ errorBadSignature("unknown tree type (" + tag + ")")
+
+ def readModifiers(): Modifiers = {
+ val tag = readNat()
+ if (tag != MODIFIERS)
+ errorBadSignature("expected a modifiers tag (" + tag + ")")
+ val end = readNat() + readIndex
+ val pflagsHi = readNat()
+ val pflagsLo = readNat()
+ val pflags = (pflagsHi.toLong << 32) + pflagsLo
+ val flags = pickledToRawFlags(pflags)
+ val privateWithin = readNameRef()
+ Modifiers(flags, privateWithin, Nil, Map.empty)
+ }
+
+ /* Read a reference to a pickled item */
+ protected def readNameRef(): Name = at(readNat(), readName)
+ protected def readSymbolRef(): Symbol = at(readNat(), readSymbol)
+ protected def readTypeRef(): Type = at(readNat(), readType)
+ protected def readConstantRef(): Constant = at(readNat(), readConstant)
+ protected def readAnnotationRef(): AnnotationInfo =
+ at(readNat(), readAnnotation)
+ protected def readModifiersRef(): Modifiers =
+ at(readNat(), readModifiers)
+ protected def readTreeRef(): Tree =
+ at(readNat(), readTree)
+
+ protected def readTemplateRef(): Template =
+ readTreeRef() match {
+ case templ:Template => templ
+ case other =>
+ errorBadSignature("expected a template (" + other + ")")
+ }
+ protected def readCaseDefRef(): CaseDef =
+ readTreeRef() match {
+ case tree:CaseDef => tree
+ case other =>
+ errorBadSignature("expected a case def (" + other + ")")
+ }
+ protected def readValDefRef(): ValDef =
+ readTreeRef() match {
+ case tree:ValDef => tree
+ case other =>
+ errorBadSignature("expected a ValDef (" + other + ")")
+ }
+ protected def readIdentRef(): Ident =
+ readTreeRef() match {
+ case tree:Ident => tree
+ case other =>
+ errorBadSignature("expected an Ident (" + other + ")")
+ }
+ protected def readTypeDefRef(): TypeDef =
+ readTreeRef() match {
+ case tree:TypeDef => tree
+ case other =>
+ errorBadSignature("expected a TypeDef (" + other + ")")
+ }
+
+ protected def errorBadSignature(msg: String) =
+ throw new RuntimeException("malformed Scala signature of " + classRoot.name + " at " + readIndex + "; " + msg)
+
+ protected def errorMissingRequirement(msg: String): Nothing =
+ if (debug) errorBadSignature(msg)
+ else throw new IOException("class file needed by "+classRoot.name+" is missing.\n"+msg)
+
+ protected def errorMissingRequirement(name: Name, owner: Symbol): Nothing =
+ errorMissingRequirement("reference " + NameTransformer.decode(name.toString) + " of " + owner.tpe + " refers to a nonexistent symbol.")
+
+ /** pre: `fun` points to a symbol with an overloaded type.
+ * Selects the overloaded alternative of `fun` which best matches given
+ * argument types `argtpes` and result type `restpe`. Stores this alternative as
+ * the symbol of `fun`.
+ */
+ def inferMethodAlternative(fun: Tree, argtpes: List[Type], restpe: Type)
+
+ /** Create a lazy type which when completed returns type at index `i`. */
+ def newLazyTypeRef(i: Int): LazyType
+
+ /** Create a lazy type which when completed returns type at index `i` and sets alias
+ * of completed symbol to symbol at index `j`
+ */
+ def newLazyTypeRefAndAlias(i: Int, j: Int): LazyType
+ }
+}
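
[Editorial aside, not part of the patch: `readModifiers` above reassembles the 64-bit pickled flag word from two pickled Nats with `(pflagsHi.toLong << 32) + pflagsLo`. The sketch below isolates that arithmetic; the object and method names are hypothetical, and both arguments are assumed non-negative, as pickled Nats are.]

```scala
// Recombine the high and low 32-bit halves of a pickled flag word.
object PickledFlags {
  def combine(pflagsHi: Int, pflagsLo: Int): Long =
    // widen the high half first so the shift happens in 64 bits
    (pflagsHi.toLong << 32) + pflagsLo
}
```

Note the order matters: shifting `pflagsHi` as an `Int` would discard the high bits before widening.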
diff --git a/src/library/scala/reflect/generic/Universe.scala b/src/library/scala/reflect/generic/Universe.scala
new file mode 100755
index 0000000000..101295ae79
--- /dev/null
+++ b/src/library/scala/reflect/generic/Universe.scala
@@ -0,0 +1,16 @@
+package scala.reflect
+package generic
+
+abstract class Universe extends Symbols
+ with Types
+ with Constants
+ with Scopes
+ with Names
+ with StdNames
+ with Trees
+ with AnnotationInfos
+ with StandardDefinitions {
+ type Position
+ val NoPosition: Position
+}
+
diff --git a/src/library/scala/runtime/BoxesRunTime.java b/src/library/scala/runtime/BoxesRunTime.java
index 087331e1c5..7c27835b5a 100644
--- a/src/library/scala/runtime/BoxesRunTime.java
+++ b/src/library/scala/runtime/BoxesRunTime.java
@@ -28,7 +28,7 @@ import scala.math.ScalaNumber;
* @author Martin Odersky
* @contributor Stepan Koltsov
* @version 2.0 */
-public class BoxesRunTime
+public final class BoxesRunTime
{
private static final int CHAR = 0, BYTE = 1, SHORT = 2, INT = 3, LONG = 4, FLOAT = 5, DOUBLE = 6, OTHER = 7;
@@ -136,38 +136,51 @@ public class BoxesRunTime
* in any case, we dispatch to it as soon as we spot one on either side.
*/
public static boolean equals2(Object x, Object y) {
- if (x instanceof Number) {
- Number xn = (Number)x;
-
- if (y instanceof Number) {
- Number yn = (Number)y;
- int xcode = eqTypeCode(xn);
- int ycode = eqTypeCode(yn);
- switch (ycode > xcode ? ycode : xcode) {
- case INT:
- return xn.intValue() == yn.intValue();
- case LONG:
- return xn.longValue() == yn.longValue();
- case FLOAT:
- return xn.floatValue() == yn.floatValue();
- case DOUBLE:
- return xn.doubleValue() == yn.doubleValue();
- default:
- if ((yn instanceof ScalaNumber) && !(xn instanceof ScalaNumber))
- return y.equals(x);
- }
- } else if (y instanceof Character)
- return equalsNumChar(xn, (Character)y);
- } else if (x instanceof Character) {
- Character xc = (Character)x;
- if (y instanceof Character)
- return xc.charValue() == ((Character)y).charValue();
- if (y instanceof Number)
- return equalsNumChar((Number)y, xc);
- }
+ if (x instanceof Number)
+ return equalsNumObject((Number)x, y);
+ if (x instanceof Character)
+ return equalsCharObject((Character)x, y);
+
return x.equals(y);
}
+ public static boolean equalsNumObject(Number xn, Object y) {
+ if (y instanceof Number)
+ return equalsNumNum(xn, (Number)y);
+ else if (y instanceof Character)
+ return equalsNumChar(xn, (Character)y);
+
+ return xn.equals(y);
+ }
+
+ public static boolean equalsNumNum(Number xn, Number yn) {
+ int xcode = eqTypeCode(xn);
+ int ycode = eqTypeCode(yn);
+ switch (ycode > xcode ? ycode : xcode) {
+ case INT:
+ return xn.intValue() == yn.intValue();
+ case LONG:
+ return xn.longValue() == yn.longValue();
+ case FLOAT:
+ return xn.floatValue() == yn.floatValue();
+ case DOUBLE:
+ return xn.doubleValue() == yn.doubleValue();
+ default:
+ if ((yn instanceof ScalaNumber) && !(xn instanceof ScalaNumber))
+ return yn.equals(xn);
+ }
+ return xn.equals(yn);
+ }
+
+ public static boolean equalsCharObject(Character xc, Object y) {
+ if (y instanceof Character)
+ return xc.charValue() == ((Character)y).charValue();
+ if (y instanceof Number)
+ return equalsNumChar((Number)y, xc);
+
+ return xc.equals(y);
+ }
+
private static boolean equalsNumChar(Number xn, Character yc) {
char ch = yc.charValue();
switch (eqTypeCode(xn)) {
@@ -212,27 +225,27 @@ public class BoxesRunTime
* versions are equal. This still needs reconciliation.
*/
public static int hashFromLong(Long n) {
- int iv = n.intValue();
- if (iv == n.longValue()) return iv;
- else return n.hashCode();
+ int iv = n.intValue();
+ if (iv == n.longValue()) return iv;
+ else return n.hashCode();
}
public static int hashFromDouble(Double n) {
- int iv = n.intValue();
- double dv = n.doubleValue();
- if (iv == dv) return iv;
+ int iv = n.intValue();
+ double dv = n.doubleValue();
+ if (iv == dv) return iv;
- long lv = n.longValue();
- if (lv == dv) return Long.valueOf(lv).hashCode();
- else return n.hashCode();
+ long lv = n.longValue();
+ if (lv == dv) return Long.valueOf(lv).hashCode();
+ else return n.hashCode();
}
public static int hashFromFloat(Float n) {
- int iv = n.intValue();
- float fv = n.floatValue();
- if (iv == fv) return iv;
+ int iv = n.intValue();
+ float fv = n.floatValue();
+ if (iv == fv) return iv;
- long lv = n.longValue();
- if (lv == fv) return Long.valueOf(lv).hashCode();
- else return n.hashCode();
+ long lv = n.longValue();
+ if (lv == fv) return Long.valueOf(lv).hashCode();
+ else return n.hashCode();
}
public static int hashFromNumber(Number n) {
if (n instanceof Long) return hashFromLong((Long)n);
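The refactoring above splits `equals2` into `equalsNumObject`/`equalsNumNum`/`equalsCharObject` so callers can dispatch straight to the relevant case. The core idea — widen both boxed numbers before comparing, and make the hash functions agree with that widened equality — can be sketched in plain Java (a simplified illustration only; the real `BoxesRunTime` switches on `eqTypeCode` and also defers to `ScalaNumber.equals`):

```java
public class NumEq {
    // Simplified equalsNumNum: compare at the wider of the two types.
    // (Illustrative only -- the real code distinguishes INT/LONG/FLOAT/DOUBLE.)
    static boolean numEquals(Number x, Number y) {
        if (x instanceof Long || y instanceof Long)
            return x.longValue() == y.longValue();
        return x.doubleValue() == y.doubleValue();
    }

    // Mirrors hashFromLong: reuse the Integer hash when the value fits in
    // an int, so boxed Integer(5) and Long(5L) hash to the same bucket.
    static int hashOf(Long n) {
        int iv = n.intValue();
        return iv == n.longValue() ? iv : n.hashCode();
    }

    public static void main(String[] args) {
        System.out.println(Integer.valueOf(1).equals(Long.valueOf(1L))); // false in stock Java
        System.out.println(numEquals(1, 1L));                            // true after widening
        System.out.println(hashOf(5L) == Integer.valueOf(5).hashCode()); // true
    }
}
```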
diff --git a/src/library/scala/runtime/NonLocalReturnControl.scala b/src/library/scala/runtime/NonLocalReturnControl.scala
new file mode 100644
index 0000000000..5591d4871b
--- /dev/null
+++ b/src/library/scala/runtime/NonLocalReturnControl.scala
@@ -0,0 +1,16 @@
+/* __ *\
+** ________ ___ / / ___ Scala API **
+** / __/ __// _ | / / / _ | (c) 2002-2010, LAMP/EPFL **
+** __\ \/ /__/ __ |/ /__/ __ | http://scala-lang.org/ **
+** /____/\___/_/ |_/____/_/ | | **
+** |/ **
+\* */
+
+// $Id$
+
+
+package scala.runtime
+
+import scala.util.control.ControlThrowable
+
+class NonLocalReturnControl[T](val key: AnyRef, val value: T) extends ControlThrowable
diff --git a/src/library/scala/runtime/NonLocalReturnException.scala b/src/library/scala/runtime/NonLocalReturnException.scala
index 4bd8ceb058..19a216be7c 100644
--- a/src/library/scala/runtime/NonLocalReturnException.scala
+++ b/src/library/scala/runtime/NonLocalReturnException.scala
@@ -11,6 +11,9 @@
package scala.runtime
-import scala.util.control.ControlException
+import scala.util.control.ControlThrowable
-class NonLocalReturnException[T](val key: AnyRef, val value: T) extends RuntimeException with ControlException
+/** !!! This class has been replaced by NonLocalReturnControl and should be deleted.
+ * But it can't be deleted until starr is updated to use the new name.
+ */
+class NonLocalReturnException[T](val key: AnyRef, val value: T) extends ControlThrowable
diff --git a/src/library/scala/runtime/RichChar.scala b/src/library/scala/runtime/RichChar.scala
index 63fbe20f3c..f5e2625fd8 100644
--- a/src/library/scala/runtime/RichChar.scala
+++ b/src/library/scala/runtime/RichChar.scala
@@ -12,7 +12,7 @@
package scala.runtime
import java.lang.Character
-import collection.{IndexedSeq, IndexedSeqView}
+import collection.immutable.NumericRange
/** <p>
* For example, in the following code
@@ -82,22 +82,14 @@ final class RichChar(x: Char) extends Proxy with Ordered[Char] {
@deprecated("Use ch.isUpper instead")
def isUpperCase: Boolean = isUpper
- /** Create a <code>[Char]</code> over the characters from 'x' to 'y' - 1
+ /** Create a <code>[Char]</code> over the characters from 'x' to 'limit' - 1
*/
- def until(limit: Char): IndexedSeqView[Char, IndexedSeq[Char]] =
- if (limit <= x) IndexedSeq.empty.view
- else
- new IndexedSeqView[Char, IndexedSeq[Char]] {
- protected def underlying = IndexedSeq.empty[Char]
- def length = limit - x
- def apply(i: Int): Char = {
- require(i >= 0 && i < length)
- (x + i).toChar
- }
- }
-
- /** Create a <code>IndexedSeqView[Char]</code> over the characters from 'x' to 'y'
+ def until(limit: Char): NumericRange[Char] =
+ new NumericRange.Exclusive(x, limit, 1.toChar)
+
+ /** Create a <code>IndexedSeqView[Char]</code> over the characters from 'x' to 'limit'
*/
- def to(y: Char): IndexedSeqView[Char, IndexedSeq[Char]] = until((y + 1).toChar)
+ def to(limit: Char): NumericRange[Char] =
+ new NumericRange.Inclusive(x, limit, 1.toChar)
}
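With this change `until` and `to` return a real `NumericRange` instead of a hand-rolled `IndexedSeqView`, giving character ranges the usual exclusive/inclusive bound semantics. A rough Java analogue of those semantics using `IntStream` (illustrative only, not the library code):

```java
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class CharRanges {
    // like NumericRange.Exclusive: 'a' until 'e' stops before the bound
    static String until(char lo, char hi) {
        return IntStream.range(lo, hi)
                .mapToObj(c -> String.valueOf((char) c))
                .collect(Collectors.joining());
    }

    // like NumericRange.Inclusive: 'a' to 'e' includes the bound
    static String to(char lo, char hi) {
        return IntStream.rangeClosed(lo, hi)
                .mapToObj(c -> String.valueOf((char) c))
                .collect(Collectors.joining());
    }

    public static void main(String[] args) {
        System.out.println(until('a', 'e')); // abcd
        System.out.println(to('a', 'e'));    // abcde
    }
}
```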
diff --git a/src/library/scala/runtime/ScalaRunTime.scala b/src/library/scala/runtime/ScalaRunTime.scala
index 2f6ffb5535..dffebfc892 100644
--- a/src/library/scala/runtime/ScalaRunTime.scala
+++ b/src/library/scala/runtime/ScalaRunTime.scala
@@ -12,10 +12,11 @@
package scala.runtime
import scala.reflect.ClassManifest
-import scala.collection.Seq
-import scala.collection.mutable._
+import scala.collection.{ Seq, IndexedSeq, TraversableView }
+import scala.collection.mutable.WrappedArray
import scala.collection.immutable.{ List, Stream, Nil, :: }
-import scala.util.control.ControlException
+import scala.xml.{ Node, MetaData }
+import scala.util.control.ControlThrowable
/* The object <code>ScalaRunTime</code> provides ...
*/
@@ -89,7 +90,7 @@ object ScalaRunTime {
}
/** Convert a numeric value array to an object array.
- * Needed to deal with vararg arguments of primtive types that are passed
+ * Needed to deal with vararg arguments of primitive types that are passed
* to a generic Java vararg parameter T ...
*/
def toObjectArray(src: AnyRef): Array[Object] = {
@@ -123,7 +124,7 @@ object ScalaRunTime {
private var exception: Throwable =
try { run() ; null }
catch {
- case e: ControlException => throw e // don't catch non-local returns etc
+ case e: ControlThrowable => throw e // don't catch non-local returns etc
case e: Throwable => e
}
@@ -165,7 +166,8 @@ object ScalaRunTime {
@inline def inlinedEquals(x: Object, y: Object): Boolean =
if (x eq y) true
else if (x eq null) false
- else if (x.isInstanceOf[java.lang.Number] || x.isInstanceOf[java.lang.Character]) BoxesRunTime.equals2(x, y)
+ else if (x.isInstanceOf[java.lang.Number]) BoxesRunTime.equalsNumObject(x.asInstanceOf[java.lang.Number], y)
+ else if (x.isInstanceOf[java.lang.Character]) BoxesRunTime.equalsCharObject(x.asInstanceOf[java.lang.Character], y)
else x.equals(y)
def _equals(x: Product, y: Any): Boolean = y match {
@@ -173,6 +175,50 @@ object ScalaRunTime {
case _ => false
}
+ // hashcode -----------------------------------------------------------
+
+ @inline def hash(x: Any): Int =
+ if (x.isInstanceOf[java.lang.Number]) BoxesRunTime.hashFromNumber(x.asInstanceOf[java.lang.Number])
+ else x.hashCode
+
+ @inline def hash(dv: Double): Int = {
+ val iv = dv.toInt
+ if (iv == dv) return iv
+
+ val lv = dv.toLong
+ if (lv == dv) return lv.hashCode
+ else dv.hashCode
+ }
+ @inline def hash(fv: Float): Int = {
+ val iv = fv.toInt
+ if (iv == fv) return iv
+
+ val lv = fv.toLong
+ if (lv == fv) return lv.hashCode
+ else fv.hashCode
+ }
+ @inline def hash(lv: Long): Int = {
+ val iv = lv.toInt
+ if (iv == lv) iv else lv.hashCode
+ }
+ @inline def hash(x: Int): Int = x
+ @inline def hash(x: Short): Int = x.toInt
+ @inline def hash(x: Byte): Int = x.toInt
+ @inline def hash(x: Char): Int = x.toInt
+
+ @inline def hash(x: Number): Int = runtime.BoxesRunTime.hashFromNumber(x)
+ @inline def hash(x: java.lang.Long): Int = {
+ val iv = x.intValue
+ if (iv == x.longValue) iv else x.hashCode
+ }
+
+ /** A helper method for constructing case class equality methods,
+ * because existential types get in the way of a clean outcome and
+ * it's performing a series of Any/Any equals comparisons anyway.
+ * See ticket #2867 for specifics.
+ */
+ def sameElements(xs1: Seq[Any], xs2: Seq[Any]) = xs1 sameElements xs2
+
/** Given any Scala value, convert it to a String.
*
* The primary motivation for this method is to provide a means for
@@ -186,12 +232,26 @@ object ScalaRunTime {
* @return a string representation of <code>arg</code>
*
*/
- def stringOf(arg : Any): String = arg match {
- case null => "null"
- case arg: AnyRef if isArray(arg) =>
- val d: collection.IndexedSeq[Any] = WrappedArray.make(arg).deep
- d.toString
- case arg: WrappedArray[_] => arg.deep.toString
- case arg => arg.toString
+ def stringOf(arg: Any): String = {
+ def inner(arg: Any): String = arg match {
+ case null => "null"
+ // Node extends NodeSeq extends Seq[Node] strikes again
+ case x: Node => x toString
+ // Not to mention MetaData extends Iterable[MetaData]
+ case x: MetaData => x toString
+ case x: AnyRef if isArray(x) => WrappedArray make x map inner mkString ("Array(", ", ", ")")
+ case x: TraversableView[_, _] => x.toString
+ case x: Traversable[_] if !x.hasDefiniteSize => x.toString
+ case x: Traversable[_] =>
+ // Some subclasses of AbstractFile implement Iterable, then throw an
+ // exception if you call iterator. What a world.
+ // And they can't be infinite either.
+ if (x.getClass.getName startsWith "scala.tools.nsc.io") x.toString
+ else (x map inner) mkString (x.stringPrefix + "(", ", ", ")")
+ case x => x toString
+ }
+ val s = inner(arg)
+ val nl = if (s contains "\n") "\n" else ""
+ nl + s + "\n"
}
}
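The rewritten `stringOf` recurses into arrays (and special-cases XML nodes, views, and collections without a definite size) because the default JVM `toString` on an array is just a type tag plus identity hash. The underlying problem, shown in plain Java with `Arrays.deepToString` playing the role of the recursive `inner`:

```java
import java.util.Arrays;

public class ArrayStrings {
    // Arrays.deepToString recurses into nested arrays, much as stringOf's
    // inner() does via WrappedArray.make(...) map inner
    static String deep(int[][] xs) {
        return Arrays.deepToString(xs);
    }

    public static void main(String[] args) {
        int[][] xs = { {1, 2}, {3} };
        System.out.println(xs);       // e.g. [[I@1b6d3586 -- useless in a REPL
        System.out.println(deep(xs)); // [[1, 2], [3]]
    }
}
```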
diff --git a/src/library/scala/testing/SUnit.scala b/src/library/scala/testing/SUnit.scala
index cf43bd1b06..d5d845cb98 100644
--- a/src/library/scala/testing/SUnit.scala
+++ b/src/library/scala/testing/SUnit.scala
@@ -12,6 +12,7 @@
package scala.testing
import scala.collection.mutable.ArrayBuffer
+import xml.{ Node, NodeSeq }
/**
* <p>
@@ -237,6 +238,25 @@ object SUnit {
assertTrue("(no message)", actual)
}
+ /** Temporary patchwork trying to nurse xml forward. */
+ def assertEqualsXML(msg: String, expected: NodeSeq, actual: NodeSeq) {
+ if (!expected.xml_==(actual))
+ fail(msg, expected, actual)
+ }
+ def assertEqualsXML(msg: String, expected: Seq[Node], actual: Seq[Node]) {
+ assertEqualsXML(msg, expected: NodeSeq, actual: NodeSeq)
+ }
+
+ def assertEqualsXML(expected: NodeSeq, actual: NodeSeq) {
+ assertEqualsXML("(no message)", expected, actual)
+ }
+
+ def assertSameElementsXML(actual: Seq[Node], expected: Seq[Node]) {
+ val res = (actual: NodeSeq) xml_sameElements expected
+
+ assert(res, "\nassertSameElementsXML:\n actual = %s\n expected = %s".format(actual, expected))
+ }
+
/** throws <code>AssertFailed</code> with given message <code>msg</code>.
*/
def fail(msg: String) {
diff --git a/src/library/scala/throws.scala b/src/library/scala/throws.scala
index 500db0a30a..b932ccc7ac 100644
--- a/src/library/scala/throws.scala
+++ b/src/library/scala/throws.scala
@@ -14,13 +14,13 @@ package scala
/** <p>
* Annotation for specifying the exceptions thrown by a method.
* For example:
- * </p><pre>
- * <b>class</b> Reader(fname: String) {
- * <b>private val</b> in =
- * <b>new</b> BufferedReader(<b>new</b> <a class="java/io/FileReader" href="" target="_top">FileReader</a>(fname))
- * @throws(classOf[<a class="java/io/IOException" href="" target="_top">IOException</a>])
- * <b>def</b> read() = in.read()
- * }</pre>
+ * {{{
+ * class Reader(fname: String) {
+ * private val in = new BufferedReader(new FileReader(fname))
+ * @throws(classOf[IOException])
+ * def read() = in.read()
+ * }
+ * }}}
*
* @author Nikolay Mihaylov
* @version 1.0, 19/05/2006
diff --git a/src/library/scala/util/Properties.scala b/src/library/scala/util/Properties.scala
index 73b5cf855a..b781e46be5 100644
--- a/src/library/scala/util/Properties.scala
+++ b/src/library/scala/util/Properties.scala
@@ -8,12 +8,18 @@
// $Id$
-
package scala.util
+import java.io.{ IOException, PrintWriter }
+
+/** Loads library.properties from the jar. */
+object Properties extends PropertiesTrait {
+ protected def propCategory = "library"
+ protected def pickJarBasedOn = classOf[ScalaObject]
+}
+
private[scala] trait PropertiesTrait
{
- import java.io.{ IOException, PrintWriter }
protected def propCategory: String // specializes the remainder of the values
protected def pickJarBasedOn: Class[_] // props file comes from jar containing this
@@ -21,7 +27,7 @@ private[scala] trait PropertiesTrait
protected val propFilename = "/" + propCategory + ".properties"
/** The loaded properties */
- protected lazy val props: java.util.Properties = {
+ protected lazy val scalaProps: java.util.Properties = {
val props = new java.util.Properties
val stream = pickJarBasedOn getResourceAsStream propFilename
if (stream ne null)
@@ -30,7 +36,6 @@ private[scala] trait PropertiesTrait
props
}
- protected def onull[T <: AnyRef](x: T) = if (x eq null) None else Some(x)
private def quietlyDispose(action: => Unit, disposal: => Unit) =
try { action }
finally {
@@ -38,51 +43,85 @@ private[scala] trait PropertiesTrait
catch { case _: IOException => }
}
- // for values based on system properties
- def sysprop(name: String): String = sysprop(name, "")
- def sysprop(name: String, default: String): String = System.getProperty(name, default)
- def syspropset(name: String, value: String) = System.setProperty(name, value)
+ def propIsSet(name: String) = System.getProperty(name) != null
+ def propIsSetTo(name: String, value: String) = propOrNull(name) == value
+ def propOrElse(name: String, alt: String) = System.getProperty(name, alt)
+ def propOrEmpty(name: String) = propOrElse(name, "")
+ def propOrNull(name: String) = propOrElse(name, null)
+ def propOrNone(name: String) = Option(propOrNull(name))
+ def propOrFalse(name: String) = propOrNone(name) exists (x => List("yes", "on", "true") contains x.toLowerCase)
+ def setProp(name: String, value: String) = System.setProperty(name, value)
+ def clearProp(name: String) = System.clearProperty(name)
+
+ def envOrElse(name: String, alt: String) = Option(System getenv name) getOrElse alt
// for values based on propFilename
- def prop(name: String): String = props.getProperty(name, "")
- def prop(name: String, default: String): String = props.getProperty(name, default)
+ def scalaPropOrElse(name: String, alt: String): String = scalaProps.getProperty(name, alt)
+ def scalaPropOrEmpty(name: String): String = scalaPropOrElse(name, "")
/** The version number of the jar this was loaded from plus "version " prefix,
* or "version (unknown)" if it cannot be determined.
*/
- val versionString = "version " + prop("version.number", "(unknown)")
- val copyrightString = prop("copyright.string", "(c) 2002-2010 LAMP/EPFL")
+ val versionString = "version " + scalaPropOrElse("version.number", "(unknown)")
+ val copyrightString = scalaPropOrElse("copyright.string", "(c) 2002-2010 LAMP/EPFL")
/** This is the encoding to use reading in source files, overridden with -encoding
* Note that it uses "prop" i.e. looks in the scala jar, not the system properties.
*/
- val sourceEncoding = prop("file.encoding", "UTF8")
+ def sourceEncoding = scalaPropOrElse("file.encoding", "UTF-8")
/** This is the default text encoding, overridden (unreliably) with
* JAVA_OPTS="-Dfile.encoding=Foo"
*/
- val encodingString = sysprop("file.encoding", "UTF8")
-
- val isWin = sysprop("os.name") startsWith "Windows"
- val isMac = sysprop("java.vendor") startsWith "Apple"
- val javaClassPath = sysprop("java.class.path")
- val javaHome = sysprop("java.home")
- val javaVmName = sysprop("java.vm.name")
- val javaVmVersion = sysprop("java.vm.version")
- val javaVmInfo = sysprop("java.vm.info")
- val javaVersion = sysprop("java.version")
- val tmpDir = sysprop("java.io.tmpdir")
- val userName = sysprop("user.name")
- val scalaHome = sysprop("scala.home", null) // XXX places do null checks...
+ def encodingString = propOrElse("file.encoding", "UTF-8")
+
+ /** The default end of line character.
+ */
+ def lineSeparator = propOrElse("line.separator", "\n")
+
+ /** Various well-known properties.
+ */
+ def javaClassPath = propOrEmpty("java.class.path")
+ def javaHome = propOrEmpty("java.home")
+ def javaVendor = propOrEmpty("java.vendor")
+ def javaVersion = propOrEmpty("java.version")
+ def javaVmInfo = propOrEmpty("java.vm.info")
+ def javaVmName = propOrEmpty("java.vm.name")
+ def javaVmVendor = propOrEmpty("java.vm.vendor")
+ def javaVmVersion = propOrEmpty("java.vm.version")
+ def osName = propOrEmpty("os.name")
+ def scalaHome = propOrEmpty("scala.home")
+ def tmpDir = propOrEmpty("java.io.tmpdir")
+ def userDir = propOrEmpty("user.dir")
+ def userHome = propOrEmpty("user.home")
+ def userName = propOrEmpty("user.name")
+
+ /** Some derived values.
+ */
+ def isWin = osName startsWith "Windows"
+ def isMac = javaVendor startsWith "Apple"
- // provide a main method so version info can be obtained by running this
- private val writer = new java.io.PrintWriter(Console.err, true)
def versionMsg = "Scala %s %s -- %s".format(propCategory, versionString, copyrightString)
- def main(args: Array[String]) { writer println versionMsg }
-}
+ def scalaCmd = if (isWin) "scala.bat" else "scala"
+ def scalacCmd = if (isWin) "scalac.bat" else "scalac"
-/** Loads library.properties from the jar. */
-object Properties extends PropertiesTrait {
- protected def propCategory = "library"
- protected def pickJarBasedOn = classOf[Application]
+ /** Can the java version be determined to be at least as high as the argument?
+ * Hard to properly future proof this but at the rate 1.7 is going we can leave
+ * the issue for our cyborg grandchildren to solve.
+ */
+ def isJavaAtLeast(version: String) = {
+ val okVersions = version match {
+ case "1.5" => List("1.5", "1.6", "1.7")
+ case "1.6" => List("1.6", "1.7")
+ case "1.7" => List("1.7")
+ case _ => Nil
+ }
+ okVersions exists (javaVersion startsWith _)
+ }
+
+ // provide a main method so version info can be obtained by running this
+ def main(args: Array[String]) {
+ val writer = new PrintWriter(Console.err, true)
+ writer println versionMsg
+ }
}
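`isJavaAtLeast` works by prefix-matching `java.version` against a per-bound whitelist, which sidesteps parsing the `1.x.y_zz` version scheme entirely. A standalone sketch of that check (hypothetical class and parameter names, not the `Properties` source):

```java
import java.util.Arrays;
import java.util.List;

public class JavaVersionCheck {
    // mirror isJavaAtLeast: enumerate the versions that satisfy the bound,
    // then prefix-match the reported java.version string against them
    static boolean isJavaAtLeast(String required, String reported) {
        List<String> ok;
        switch (required) {
            case "1.5": ok = Arrays.asList("1.5", "1.6", "1.7"); break;
            case "1.6": ok = Arrays.asList("1.6", "1.7"); break;
            case "1.7": ok = Arrays.asList("1.7"); break;
            default:    ok = Arrays.asList(); break;
        }
        for (String v : ok)
            if (reported.startsWith(v)) return true;
        return false;
    }

    public static void main(String[] args) {
        System.out.println(isJavaAtLeast("1.5", "1.6.0_45")); // true
        System.out.println(isJavaAtLeast("1.7", "1.6.0_45")); // false
    }
}
```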
diff --git a/src/library/scala/util/Random.scala b/src/library/scala/util/Random.scala
index 3baa7e33e3..ffa248d638 100644
--- a/src/library/scala/util/Random.scala
+++ b/src/library/scala/util/Random.scala
@@ -107,19 +107,17 @@ class Random(val self: java.util.Random) {
*
* @since 2.8
*/
-object Random extends Random
-{
- import collection.Traversable
+object Random extends Random {
import collection.mutable.ArrayBuffer
import collection.generic.CanBuildFrom
/** Returns a new collection of the same type in a randomly chosen order.
*
- * @param coll the Traversable to shuffle
- * @return the shuffled Traversable
+ * @param coll the TraversableOnce to shuffle
+ * @return the shuffled TraversableOnce
*/
- def shuffle[T, CC[X] <: Traversable[X]](coll: CC[T])(implicit bf: CanBuildFrom[CC[T], T, CC[T]]): CC[T] = {
- val buf = new ArrayBuffer[T] ++= coll
+ def shuffle[T, CC[X] <: TraversableOnce[X]](xs: CC[T])(implicit bf: CanBuildFrom[CC[T], T, CC[T]]): CC[T] = {
+ val buf = new ArrayBuffer[T] ++= xs
def swap(i1: Int, i2: Int) {
val tmp = buf(i1)
@@ -132,6 +130,6 @@ object Random extends Random
swap(n - 1, k)
}
- bf(coll) ++= buf result
+ bf(xs) ++= buf result
}
}
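The `shuffle` loop above is a Fisher–Yates shuffle: walk `n` down from the buffer length and swap position `n - 1` with a uniformly chosen position `k` in `[0, n)`. The same loop in Java (a sketch, not the library code):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.Random;

public class Shuffle {
    // same structure as Random.shuffle: for n = length down to 2,
    // swap index n-1 with a random index k in [0, n)
    static <T> void fisherYates(List<T> buf, Random rnd) {
        for (int n = buf.size(); n > 1; n--) {
            int k = rnd.nextInt(n);
            Collections.swap(buf, n - 1, k);
        }
    }

    public static void main(String[] args) {
        List<Integer> xs = new ArrayList<>(Arrays.asList(1, 2, 3, 4, 5));
        fisherYates(xs, new Random(0));
        System.out.println(xs); // a permutation of 1..5
    }
}
```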
diff --git a/src/library/scala/util/Sorting.scala b/src/library/scala/util/Sorting.scala
index 73228b53d5..4189f2d912 100644
--- a/src/library/scala/util/Sorting.scala
+++ b/src/library/scala/util/Sorting.scala
@@ -12,46 +12,43 @@
package scala.util
import scala.reflect.ClassManifest
-/** <p>
- * The Sorting object provides functions that can sort various kinds of
- * objects. You can provide a comparison function, or you can request a sort
- * of items that are viewable as <code>Ordered</code>. Some sorts that
- * operate directly on a subset of value types are also provided. These
- * implementations are derived from those in the Sun JDK.
- * </p>
- * <p>
- * Note that stability doesn't matter for value types, so use the quickSort
- * variants for those. <code>stableSort</code> is intended to be used with
- * objects when the prior ordering should be preserved, where possible.
- * </p>
- *
- * @author Ross Judson
- * @version 1.0
- */
+/** The Sorting object provides functions that can sort various kinds of
+ * objects. You can provide a comparison function, or you can request a sort
+ * of items that are viewable as <code>Ordered</code>. Some sorts that
+ * operate directly on a subset of value types are also provided. These
+ * implementations are derived from those in the Sun JDK.
+ *
+ * Note that stability doesn't matter for value types, so use the quickSort
+ * variants for those. <code>stableSort</code> is intended to be used with
+ * objects when the prior ordering should be preserved, where possible.
+ *
+ * @author Ross Judson
+ * @version 1.0
+ */
object Sorting {
/** Provides implicit access to sorting on arbitrary sequences of orderable
* items. This doesn't quite work the way that I want yet -- K should be
* bounded as viewable, but the compiler rejects that.
*/
- implicit def seq2RichSort[K <: Ordered[K] : ClassManifest](s: Seq[K]) = new RichSorting[K](s)
+ // implicit def seq2RichSort[K <: Ordered[K] : ClassManifest](s: Seq[K]) = new RichSorting[K](s)
/** Quickly sort an array of Doubles. */
- def quickSort(a: Array[Double]) = sort1(a, 0, a.length)
+ def quickSort(a: Array[Double]) { sort1(a, 0, a.length) }
- /** Quickly sort an array of items that are viewable as ordered. */
- def quickSort[K <% Ordered[K]](a: Array[K]) = sort1(a, 0, a.length)
+ /** Quickly sort an array of items with an implicit Ordering. */
+ def quickSort[K](a: Array[K])(implicit ord: Ordering[K]) { sort1(a, 0, a.length) }
/** Quickly sort an array of Ints. */
- def quickSort(a: Array[Int]) = sort1(a, 0, a.length)
+ def quickSort(a: Array[Int]) { sort1(a, 0, a.length) }
/** Quickly sort an array of Floats. */
- def quickSort(a: Array[Float]) = sort1(a, 0, a.length)
+ def quickSort(a: Array[Float]) { sort1(a, 0, a.length) }
/** Sort an array of K where K is Ordered, preserving the existing order
- where the values are equal. */
- def stableSort[K <% Ordered[K] : ClassManifest](a: Array[K]) {
- stableSort(a, 0, a.length-1, new Array[K](a.length), (a:K, b:K) => a < b)
+ * where the values are equal. */
+ def stableSort[K](a: Array[K])(implicit m: ClassManifest[K], ord: Ordering[K]) {
+ stableSort(a, 0, a.length-1, new Array[K](a.length), ord.lt _)
}
/** Sorts an array of <code>K</code> given an ordering function
@@ -77,8 +74,8 @@ object Sorting {
}
/** Sorts an arbitrary sequence of items that are viewable as ordered. */
- def stableSort[K <% Ordered[K] : ClassManifest](a: Seq[K]): Array[K] =
- stableSort(a, (a:K, b:K) => a < b)
+ def stableSort[K](a: Seq[K])(implicit m: ClassManifest[K], ord: Ordering[K]): Array[K] =
+ stableSort(a, ord.lt _)
/** Stably sorts a sequence of items given an extraction function that will
* return an ordered key from an item.
@@ -87,10 +84,11 @@ object Sorting {
* @param f the comparison function.
* @return the sorted sequence of items.
*/
- def stableSort[K : ClassManifest, M <% Ordered[M]](a: Seq[K], f: K => M): Array[K] =
- stableSort(a, (a: K, b: K) => f(a) < f(b))
+ def stableSort[K, M](a: Seq[K], f: K => M)(implicit m: ClassManifest[K], ord: Ordering[M]): Array[K] =
+ stableSort(a)(m, ord on f)
- private def sort1[K <% Ordered[K]](x: Array[K], off: Int, len: Int) {
+ private def sort1[K](x: Array[K], off: Int, len: Int)(implicit ord: Ordering[K]) {
+ import ord._
def swap(a: Int, b: Int) {
val t = x(a)
x(a) = x(b)
@@ -532,51 +530,6 @@ object Sorting {
}
}
}
-
- // for testing
- def main(args: Array[String]) {
- val tuples = Array(
- (1, "one"), (1, "un"), (3, "three"), (2, "deux"),
- (2, "two"), (0, "zero"), (3, "trois")
- )
- val integers = Array(
- 3, 4, 0, 4, 5, 0, 3, 3, 0
- )
- val doubles = Array(
- 3.4054752250314283E9,
- 4.9663151227666664E10,
-// 0.0/0.0 is interpreted as Nan
-// 0.0/0.0,
- 4.9663171987125E10,
- 5.785996973446602E9,
-// 0.0/0.0,
- 3.973064849653333E10,
- 3.724737288678125E10
-// 0.0/0.0
- )
- val floats = Array(
- 3.4054752250314283E9f,
- 4.9663151227666664E10f,
-// 0.0f/0.0f,
- 4.9663171987125E10f,
- 5.785996973446602E9f,
-// 0.0f/0.0f,
- 3.973064849653333E10f,
- 3.724737288678125E10f
-// 0.0f/0.0f
- )
- Sorting quickSort tuples
- println(tuples.toList)
-
- Sorting quickSort integers
- println(integers.toList)
-
- Sorting quickSort doubles
- println(doubles.toList)
-
- Sorting quickSort floats
- println(floats.toList)
- }
}
/** <p>
@@ -585,8 +538,7 @@ object Sorting {
* the items are ordered.
* </p>
*/
-class RichSorting[K <: Ordered[K] : ClassManifest](s: Seq[K]) {
-
+class RichSorting[K](s: Seq[K])(implicit m: ClassManifest[K], ord: Ordering[K]) {
/** Returns an array with a sorted copy of the RichSorting's sequence.
*/
def sort = Sorting.stableSort(s)
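The `Sorting` changes replace view bounds (`K <% Ordered[K]`) with implicit `Ordering` parameters; in particular `stableSort(a, f)` now delegates via `ord on f`, which derives an ordering on items from an ordering on extracted keys. Java's `Comparator.comparingInt` is the direct analogue, and `Arrays.sort` on objects is stable, like `stableSort` (illustrative sketch):

```java
import java.util.Arrays;
import java.util.Comparator;

public class StableByKey {
    // Comparator.comparingInt(f) plays the role of `ord on f`: order items
    // by comparing the keys f extracts. Object Arrays.sort is stable, so
    // equal-length words keep their original relative order.
    static String[] byLength(String[] words) {
        String[] copy = words.clone();
        Arrays.sort(copy, Comparator.comparingInt(String::length));
        return copy;
    }

    public static void main(String[] args) {
        String[] sorted = byLength(new String[] { "pear", "fig", "apple", "kiwi" });
        System.out.println(Arrays.toString(sorted)); // [fig, pear, kiwi, apple]
    }
}
```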
diff --git a/src/library/scala/util/automata/SubsetConstruction.scala b/src/library/scala/util/automata/SubsetConstruction.scala
index 0ebdd160e7..c8fba39f0e 100644
--- a/src/library/scala/util/automata/SubsetConstruction.scala
+++ b/src/library/scala/util/automata/SubsetConstruction.scala
@@ -57,7 +57,7 @@ class SubsetConstruction[T <: AnyRef](val nfa: NondetWordAutom[T]) {
invIndexMap = invIndexMap.updated(ix, P)
ix += 1
- // make transitiion map
+ // make transition map
val Pdelta = new mutable.HashMap[T, BitSet]
delta.update(P, Pdelta)
diff --git a/src/library/scala/util/automata/WordBerrySethi.scala b/src/library/scala/util/automata/WordBerrySethi.scala
index d3238b6f67..b54fdc53f2 100644
--- a/src/library/scala/util/automata/WordBerrySethi.scala
+++ b/src/library/scala/util/automata/WordBerrySethi.scala
@@ -81,7 +81,7 @@ abstract class WordBerrySethi extends BaseBerrySethi {
this.labels += label
}
- // overriden in BindingBerrySethi
+ // overridden in BindingBerrySethi
protected def seenLabel(r: RegExp, label: lang._labelT): Int = {
pos += 1
seenLabel(r, pos, label)
diff --git a/src/library/scala/util/control/Breaks.scala b/src/library/scala/util/control/Breaks.scala
index 7ae4cba63a..1f06f04418 100644
--- a/src/library/scala/util/control/Breaks.scala
+++ b/src/library/scala/util/control/Breaks.scala
@@ -28,14 +28,14 @@ package scala.util.control
*/
class Breaks {
- private val breakException = new BreakException
+ private val breakException = new BreakControl
/** A block from which one can exit with a `break''. */
def breakable(op: => Unit) {
try {
op
} catch {
- case ex: BreakException =>
+ case ex: BreakControl =>
if (ex ne breakException) throw ex
}
}
@@ -61,5 +61,5 @@ class Breaks {
*/
object Breaks extends Breaks
-private class BreakException extends RuntimeException with ControlException
+private class BreakControl extends ControlThrowable
diff --git a/src/library/scala/util/control/ControlException.scala b/src/library/scala/util/control/ControlThrowable.scala
index 73f2b31e89..090bec4e98 100644
--- a/src/library/scala/util/control/ControlException.scala
+++ b/src/library/scala/util/control/ControlThrowable.scala
@@ -21,19 +21,19 @@ package scala.util.control
*
* <p>Instances of <code>Throwable</code> subclasses marked in
* this way should not normally be caught. Where catch-all behaviour is
- * required <code>ControlException</code>s should be propagated, for
+ * required <code>ControlThrowable</code>s should be propagated, for
* example,</p>
*
* <pre>
- * import scala.util.control.ControlException
+ * import scala.util.control.ControlThrowable
*
* try {
* // Body might throw arbitrarily
* } catch {
- * case ce : ControlException => throw ce // propagate
+ * case ce : ControlThrowable => throw ce // propagate
* case t : Exception => log(t) // log and suppress
* </pre>
*
* @author Miles Sabin
*/
-trait ControlException extends Throwable with NoStackTrace
+trait ControlThrowable extends Throwable with NoStackTrace
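`ControlThrowable` combines `Throwable` with `NoStackTrace`, so control-flow exceptions such as `BreakControl` and `NonLocalReturnControl` skip the expensive stack-trace capture. The same pattern in plain Java (hypothetical names, not the library source):

```java
public class ControlFlow {
    // a value-carrying control exception, in the spirit of NonLocalReturnControl;
    // overriding fillInStackTrace to a no-op is the NoStackTrace trick
    static class NonLocalReturn extends RuntimeException {
        final Object value;
        NonLocalReturn(Object value) { this.value = value; }
        @Override public synchronized Throwable fillInStackTrace() { return this; }
    }

    static int findFirstNegative(int[] xs) {
        try {
            for (int i = 0; i < xs.length; i++)
                if (xs[i] < 0) throw new NonLocalReturn(i); // cheap early exit
            return -1;
        } catch (NonLocalReturn r) {
            return (Integer) r.value;
        }
    }

    public static void main(String[] args) {
        System.out.println(findFirstNegative(new int[] { 3, 1, -4, 1 })); // 2
        // no stack trace was captured at construction time
        System.out.println(new NonLocalReturn(0).getStackTrace().length); // 0
    }
}
```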
diff --git a/src/library/scala/util/logging/ConsoleLogger.scala b/src/library/scala/util/logging/ConsoleLogger.scala
index d4ef268e37..ec4148abc9 100644
--- a/src/library/scala/util/logging/ConsoleLogger.scala
+++ b/src/library/scala/util/logging/ConsoleLogger.scala
@@ -21,8 +21,6 @@ package scala.util.logging
trait ConsoleLogger extends Logged {
/** logs argument to Console using <code>Console.println</code>
- *
- * @param msg ...
*/
override def log(msg: String): Unit = Console.println(msg)
}
diff --git a/src/library/scala/util/matching/Regex.scala b/src/library/scala/util/matching/Regex.scala
index 2ceef4563c..1384dfa47c 100644
--- a/src/library/scala/util/matching/Regex.scala
+++ b/src/library/scala/util/matching/Regex.scala
@@ -107,6 +107,32 @@ class Regex(regex: String, groupNames: String*) {
m.replaceAll(replacement)
}
+ /**
+ * Replaces all matches using a replacer function.
+ *
+ * @param target The string to match.
+ * @param replacer The function which maps a match to another string.
+ * @return The target string after replacements.
+ */
+ def replaceAllIn(target: java.lang.CharSequence, replacer: Match => String): String = {
+ val it = new Regex.MatchIterator(target, this, groupNames).replacementData
+ while (it.hasNext) {
+ val matchdata = it.next
+ it.replace(replacer(matchdata))
+ }
+ it.replaced
+ }
+
+ def replaceSomeIn(target: java.lang.CharSequence, replacer: Match => Option[String]): String = {
+ val it = new Regex.MatchIterator(target, this, groupNames).replacementData
+ while (it.hasNext) {
+ val matchdata = it.next
+ val replaceopt = replacer(matchdata)
+ if (replaceopt != None) it.replace(replaceopt.get)
+ }
+ it.replaced
+ }
+
/** Replaces the first match by a string.
*
* @param target The string to match
@@ -227,7 +253,7 @@ object Regex {
}
- /** A case class for a succesful match.
+ /** A case class for a successful match.
*/
class Match(val source: java.lang.CharSequence,
matcher: Matcher,
@@ -264,12 +290,17 @@ object Regex {
def unapply(m: Match): Some[String] = Some(m.matched)
}
+ /** An extractor object that yields groups in the match. */
+ object Groups {
+ def unapplySeq(m: Match): Option[Seq[String]] = if (m.groupCount > 0) Some(1 to m.groupCount map m.group) else None
+ }
+
/** A class to step through a sequence of regex matches
*/
class MatchIterator(val source: java.lang.CharSequence, val regex: Regex, val groupNames: Seq[String])
extends Iterator[String] with MatchData { self =>
- private val matcher = regex.pattern.matcher(source)
+ protected val matcher = regex.pattern.matcher(source)
private var nextSeen = false
/** Is there another match? */
@@ -307,6 +338,31 @@ object Regex {
def hasNext = self.hasNext
def next = { self.next; new Match(source, matcher, groupNames).force }
}
+
+ /** Convert to an iterator that yields MatchData elements instead of Strings and has replacement support */
+ private[matching] def replacementData = new Iterator[Match] with Replacement {
+ def matcher = self.matcher
+ def hasNext = self.hasNext
+ def next = { self.next; new Match(source, matcher, groupNames).force }
+ }
+ }
+
+ /**
+ * A trait able to build a string with replacements assuming it has a matcher.
+ * Meant to be mixed in with iterators.
+ */
+ private[matching] trait Replacement {
+ protected def matcher: Matcher
+
+ private var sb = new java.lang.StringBuffer
+
+ def replaced = {
+ val newsb = new java.lang.StringBuffer(sb)
+ matcher.appendTail(newsb)
+ newsb.toString
+ }
+
+ def replace(rs: String) = matcher.appendReplacement(sb, rs)
}
}
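The two methods added above thread a replacer function through `Matcher.appendReplacement`/`appendTail`. A minimal usage sketch (the pattern, group names, and input strings are illustrative, not from the patch):

```scala
import scala.util.matching.Regex

// A pattern with named groups, as supported by the Regex constructor.
val date = new Regex("""(\d{4})-(\d{2})-(\d{2})""", "year", "month", "day")

// replaceAllIn: every match is mapped through the function.
val reordered = date.replaceAllIn("Due 2010-04-07.",
  m => m.group("day") + "/" + m.group("month"))

// replaceSomeIn: returning None leaves that particular match untouched.
val partial = date.replaceSomeIn("a 2010-04-07 b 1999-12-31",
  m => if (m.group("year") == "2010") Some("<redacted>") else None)

println(reordered) // Due 07/04.
println(partial)   // a <redacted> b 1999-12-31
```

Note that the replacer's result still passes through `Matcher.appendReplacement`, so a literal `$` or `\` in it must be escaped.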
diff --git a/src/library/scala/util/parsing/ast/Binders.scala b/src/library/scala/util/parsing/ast/Binders.scala
index b397e900da..7a9b8e5dcd 100644
--- a/src/library/scala/util/parsing/ast/Binders.scala
+++ b/src/library/scala/util/parsing/ast/Binders.scala
@@ -127,10 +127,8 @@ trait Binders extends AbstractSyntax with Mappable {
* (e.g. the variable name in a local variable declaration)
*
* @param b a new binder that is distinct from the existing binders in this scope,
- * and shares their conceptual scope
- * @pre canAddBinder(b)
- * @post binds(b)
- * @post getElementFor(b) eq b
+ * and shares their conceptual scope. `canAddBinder(b)` must hold.
+ * @note `binds(b)` and `getElementFor(b) eq b` will hold.
*/
def addBinder(b: binderType) { substitution += Pair(b, b) }
@@ -140,7 +138,7 @@ trait Binders extends AbstractSyntax with Mappable {
* linked to its `UnderBinder' (i.e., while parsing, BoundElements may be added to the Scope
* associated to the UnderBinder, but after that, no changes are allowed, except for substitution)?
*
- * @returns true if `b' had not been added yet
+ * @return true if `b' had not been added yet
*/
def canAddBinder(b: binderType): Boolean = !binds(b)
@@ -150,17 +148,15 @@ trait Binders extends AbstractSyntax with Mappable {
* a proxy for the element it is bound to by its binder, `substitute' may thus be thought of
* as replacing all the bound occurrences of the given binder `b' by their new value `value'.
*
- * @param b the binder whose bound occurrences should be given a new value
+ * @param b the binder whose bound occurrences should be given a new value. `binds(b)` must hold.
* @param value the new value for the bound occurrences of `b'
- * @pre binds(b)
- * @post getElementFor(b) eq value
+ * @note `getElementFor(b) eq value` will hold.
*/
def substitute(b: binderType, value: Element): Unit = substitution(b) = value
/** Returns the current value for the bound occurrences of `b'.
*
- * @param b the contained binder whose current value should be returned
- * @pre binds(b)
+ * @param b the contained binder whose current value should be returned. `binds(b)` must hold.
*/
def getElementFor(b: binderType): Element = substitution(b)
@@ -173,7 +169,7 @@ trait Binders extends AbstractSyntax with Mappable {
def allowForwardRef: Scope[binderType] = this // TODO
/** Return a nested scope -- binders entered into it won't be visible in this scope, but if this scope allows forward references,
- the binding in the returned scope also does, and thus the check that all variables are bound is deferred until this scope is left **/
+ * the binding in the returned scope also does, and thus the check that all variables are bound is deferred until this scope is left **/
def nested: Scope[binderType] = this // TODO
def onEnter {}
@@ -193,7 +189,7 @@ trait Binders extends AbstractSyntax with Mappable {
* A `BoundElement' is represented textually by its bound element, followed by its scope's `id'.
* For example: `x@1' represents the variable `x' that is bound in the scope with `id' `1'.
*
- * @invar scope.binds(el)
+ * @note `scope.binds(el)` holds before and after.
*/
case class BoundElement[boundElement <: NameElement](el: boundElement, scope: Scope[boundElement]) extends NameElement with Proxy with BindingSensitive {
/** Returns the element this `BoundElement' stands for.
@@ -300,7 +296,7 @@ trait Binders extends AbstractSyntax with Mappable {
*
* The name `sequence' comes from the fact that this method's type is equal to the type of monadic sequence.
*
- * @pre !orig.isEmpty implies orig.forall(ub => ub.scope eq orig(0).scope)
+ * @note `!orig.isEmpty` implies `orig.forall(ub => ub.scope eq orig(0).scope)`
*
*/
def sequence[bt <: NameElement, st <% Mappable[st]](orig: List[UnderBinder[bt, st]]): UnderBinder[bt, List[st]] =
diff --git a/src/library/scala/util/parsing/combinator/Parsers.scala b/src/library/scala/util/parsing/combinator/Parsers.scala
index 6fe35ad3b0..d270757189 100644
--- a/src/library/scala/util/parsing/combinator/Parsers.scala
+++ b/src/library/scala/util/parsing/combinator/Parsers.scala
@@ -48,35 +48,21 @@ import scala.annotation.tailrec
* of the input.
* </p>
*
- * @requires Elem the type of elements the provided parsers consume
- * (When consuming invidual characters, a parser is typically called a ``scanner'',
- * which produces ``tokens'' that are consumed by what is normally called a ``parser''.
- * Nonetheless, the same principles apply, regardless of the input type.)</p>
- *<p>
- * @provides Input = Reader[Elem]
- * The type of input the parsers in this component expect.</p>
- *<p>
- * @provides Parser[+T] extends (Input => ParseResult[T])
- * Essentially, a `Parser[T]' is a function from `Input' to `ParseResult[T]'.</p>
- *<p>
- * @provides ParseResult[+T] is like an `Option[T]', in the sense that it is either
- * `Success[T]', which consists of some result (:T) (and the rest of the input) or
- * `Failure[T]', which provides an error message (and the rest of the input).</p>
- *
* @author Martin Odersky, Iulian Dragos, Adriaan Moors
*/
trait Parsers {
- /** the type of input elements */
+ /** the type of input elements the provided parsers consume (When consuming individual characters, a parser is typically
+ * called a ``scanner'', which produces ``tokens'' that are consumed by what is normally called a ``parser''.
+ * Nonetheless, the same principles apply, regardless of the input type.) */
type Elem
- /** The parser input is an abstract reader of input elements */
+ /** The parser input is an abstract reader of input elements, i.e. the type of input the parsers in this component
+ * expect. */
type Input = Reader[Elem]
- /** A base class for parser results.
- * A result is either successful or not (failure may be fatal, i.e.,
- * an Error, or not, i.e., a Failure)
- * On success, provides a result of type <code>T</code>.
- */
+ /** A base class for parser results. A result is either successful or not (failure may be fatal, i.e., an Error, or
+ * not, i.e., a Failure). On success, provides a result of type `T` which consists of some result (and the rest of
+ * the input). */
sealed abstract class ParseResult[+T] {
/** Functional composition of ParseResults
*
@@ -302,7 +288,7 @@ trait Parsers {
* characters accepts.</p>
*
* @param q a parser that accepts if p consumes less characters.
- * @return a `Parser' that returns the result of the parser consuming the most characteres (out of `p' and `q').
+ * @return a `Parser' that returns the result of the parser consuming the most characters (out of `p' and `q').
*/
def ||| [U >: T](q: => Parser[U]): Parser[U] = new Parser[U] {
def apply(in: Input) = {
@@ -362,7 +348,7 @@ trait Parsers {
def ^? [U](f: PartialFunction[T, U]): Parser[U] = ^?(f, r => "Constructor function not defined at "+r)
- /** A parser combinator that parameterises a subsequent parser with the result of this one
+ /** A parser combinator that parameterizes a subsequent parser with the result of this one
*
*<p>
* Use this combinator when a parser depends on the result of a previous parser. `p' should be
@@ -592,13 +578,18 @@ trait Parsers {
def rep1[T](first: => Parser[T], p: => Parser[T]): Parser[List[T]] = Parser { in =>
val elems = new ListBuffer[T]
- @tailrec def applyp(in0: Input): ParseResult[List[T]] = p(in0) match {
- case Success(x, rest) => elems += x ; applyp(rest)
- case _ => Success(elems.toList, in0)
+ def continue(in: Input): ParseResult[List[T]] = {
+ val p0 = p // avoid repeatedly re-evaluating by-name parser
+ @tailrec def applyp(in0: Input): ParseResult[List[T]] = p0(in0) match {
+ case Success(x, rest) => elems += x ; applyp(rest)
+ case _ => Success(elems.toList, in0)
+ }
+
+ applyp(in)
}
first(in) match {
- case Success(x, rest) => elems += x ; applyp(rest)
+ case Success(x, rest) => elems += x ; continue(rest)
case ns: NoSuccess => ns
}
}
@@ -616,10 +607,11 @@ trait Parsers {
def repN[T](num: Int, p: => Parser[T]): Parser[List[T]] =
if (num == 0) success(Nil) else Parser { in =>
val elems = new ListBuffer[T]
+ val p0 = p // avoid repeatedly re-evaluating by-name parser
@tailrec def applyp(in0: Input): ParseResult[List[T]] =
if (elems.length == num) Success(elems.toList, in0)
- else p(in0) match {
+ else p0(in0) match {
case Success(x, rest) => elems += x ; applyp(rest)
case ns: NoSuccess => return ns
}
@@ -670,7 +662,7 @@ trait Parsers {
/** A parser generator that generalises the rep1sep generator so that `q', which parses the separator,
* produces a right-associative function that combines the elements it separates. Additionally,
- * The right-most (last) element and the left-most combinating function have to be supplied.
+ * The right-most (last) element and the left-most combining function have to be supplied.
*
* rep1sep(p: Parser[T], q) corresponds to chainr1(p, q ^^ cons, cons, Nil) (where val cons = (x: T, y: List[T]) => x :: y)
*
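The `rep1` and `repN` changes above cache a by-name parser argument in a local `val` before the tail-recursive loop. A standalone sketch of the hazard being avoided (names are illustrative, not from the patch):

```scala
var evaluations = 0
def mkStep: Int => Int = { evaluations += 1; x => x + 1 } // counts evaluations

// Referencing a by-name parameter re-evaluates it on every use...
def runNaive(n: Int, step: => (Int => Int)): Int = {
  var acc = 0
  for (_ <- 1 to n) acc = step(acc)
  acc
}

// ...so the patch evaluates it once into a val before looping.
def runCached(n: Int, step: => (Int => Int)): Int = {
  val s = step
  var acc = 0
  for (_ <- 1 to n) acc = s(acc)
  acc
}

evaluations = 0
runNaive(5, mkStep)
val naiveCount = evaluations // 5 evaluations
evaluations = 0
runCached(5, mkStep)
val cachedCount = evaluations // 1 evaluation
println(naiveCount + " vs " + cachedCount) // 5 vs 1
```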
diff --git a/src/library/scala/util/parsing/combinator/lexical/Lexical.scala b/src/library/scala/util/parsing/combinator/lexical/Lexical.scala
index fc3100053a..7a35bcad7d 100644
--- a/src/library/scala/util/parsing/combinator/lexical/Lexical.scala
+++ b/src/library/scala/util/parsing/combinator/lexical/Lexical.scala
@@ -9,11 +9,12 @@
// $Id$
-package scala.util.parsing.combinator.lexical
-import scala.util.parsing.combinator._
+package scala.util.parsing
+package combinator
+package lexical
-import scala.util.parsing.syntax._
-import scala.util.parsing.input.CharArrayReader.EofCh
+import token._
+import input.CharArrayReader.EofCh
/** <p>
* This component complements the <code>Scanners</code> component with
diff --git a/src/library/scala/util/parsing/combinator/lexical/Scanners.scala b/src/library/scala/util/parsing/combinator/lexical/Scanners.scala
index 07f4975cf8..96e9a76572 100644
--- a/src/library/scala/util/parsing/combinator/lexical/Scanners.scala
+++ b/src/library/scala/util/parsing/combinator/lexical/Scanners.scala
@@ -9,11 +9,12 @@
// $Id$
-package scala.util.parsing.combinator.lexical
-import scala.util.parsing.combinator._
+package scala.util.parsing
+package combinator
+package lexical
-import scala.util.parsing.syntax._
-import scala.util.parsing.input._
+import token._
+import input._
/** <p>
* This component provides core functionality for lexical parsers.
@@ -23,13 +24,6 @@ import scala.util.parsing.input._
* {@see StdLexical}, for more functionality.
* </p>
*
- * @requires token a parser that produces a token (from a stream of characters)
- * @requires whitespace a unit-parser for white-space
- * @provides Scanner essentially a parser that parses a stream of characters
- * to produce `Token's, which are typically passed to a
- * syntactical parser (which operates on `Token's, not on
- * individual characters).
- *
* @author Martin Odersky, Adriaan Moors
*/
trait Scanners extends Parsers {
diff --git a/src/library/scala/util/parsing/combinator/lexical/StdLexical.scala b/src/library/scala/util/parsing/combinator/lexical/StdLexical.scala
index 1bb3e7c83f..bc53e3731d 100644
--- a/src/library/scala/util/parsing/combinator/lexical/StdLexical.scala
+++ b/src/library/scala/util/parsing/combinator/lexical/StdLexical.scala
@@ -9,11 +9,12 @@
// $Id$
-package scala.util.parsing.combinator.lexical
-import scala.util.parsing.combinator._
+package scala.util.parsing
+package combinator
+package lexical
-import scala.util.parsing.syntax._
-import scala.util.parsing.input.CharArrayReader.EofCh
+import token._
+import input.CharArrayReader.EofCh
import collection.mutable.HashSet
/** <p>
diff --git a/src/library/scala/util/parsing/combinator/syntactical/StandardTokenParsers.scala b/src/library/scala/util/parsing/combinator/syntactical/StandardTokenParsers.scala
index 85c0592572..31fa06035c 100644
--- a/src/library/scala/util/parsing/combinator/syntactical/StandardTokenParsers.scala
+++ b/src/library/scala/util/parsing/combinator/syntactical/StandardTokenParsers.scala
@@ -8,12 +8,12 @@
// $Id$
+package scala.util.parsing
+package combinator
+package syntactical
-package scala.util.parsing.combinator.syntactical
-import scala.util.parsing.combinator._
-
-import scala.util.parsing.syntax._
-import scala.util.parsing.combinator.lexical.StdLexical
+import token._
+import lexical.StdLexical
/** This component provides primitive parsers for the standard tokens defined in `StdTokens'.
*
diff --git a/src/library/scala/util/parsing/combinator/syntactical/StdTokenParsers.scala b/src/library/scala/util/parsing/combinator/syntactical/StdTokenParsers.scala
index 544c7f08d5..5b62280b78 100644
--- a/src/library/scala/util/parsing/combinator/syntactical/StdTokenParsers.scala
+++ b/src/library/scala/util/parsing/combinator/syntactical/StdTokenParsers.scala
@@ -9,11 +9,12 @@
// $Id$
-package scala.util.parsing.combinator.syntactical
-import scala.util.parsing.combinator._
+package scala.util.parsing
+package combinator
+package syntactical
-import scala.util.parsing.syntax._
-import scala.collection.mutable.HashMap
+import token._
+import collection.mutable.HashMap
/** This component provides primitive parsers for the standard tokens defined in `StdTokens'.
*
diff --git a/src/library/scala/util/parsing/combinator/syntactical/TokenParsers.scala b/src/library/scala/util/parsing/combinator/syntactical/TokenParsers.scala
index 01557c32a7..ae4120b402 100644
--- a/src/library/scala/util/parsing/combinator/syntactical/TokenParsers.scala
+++ b/src/library/scala/util/parsing/combinator/syntactical/TokenParsers.scala
@@ -8,23 +8,17 @@
// $Id$
+package scala.util.parsing
+package combinator
+package syntactical
-package scala.util.parsing.combinator.syntactical
-import scala.util.parsing.combinator._
-
-/** <p>
- * This is the core component for token-based parsers.
- * </p>
- * <p>
- * @requires lexical a component providing the tokens consumed by the
- * parsers in this component.
- * </p>
+/** This is the core component for token-based parsers.
*
* @author Martin Odersky, Adriaan Moors
*/
trait TokenParsers extends Parsers {
/** Tokens is the abstract type of the `Token's consumed by the parsers in this component*/
- type Tokens <: scala.util.parsing.syntax.Tokens
+ type Tokens <: token.Tokens
/** lexical is the component responsible for consuming some basic kind of
* input (usually character-based) and turning it into the tokens
diff --git a/src/library/scala/util/parsing/syntax/StdTokens.scala b/src/library/scala/util/parsing/combinator/token/StdTokens.scala
index 2321082b92..ea565235d1 100644
--- a/src/library/scala/util/parsing/syntax/StdTokens.scala
+++ b/src/library/scala/util/parsing/combinator/token/StdTokens.scala
@@ -6,7 +6,9 @@
** |/ **
\* */
-package scala.util.parsing.syntax
+package scala.util.parsing
+package combinator
+package token
/** This component provides the standard `Token's for a simple, Scala-like language.
*
diff --git a/src/library/scala/util/parsing/syntax/Tokens.scala b/src/library/scala/util/parsing/combinator/token/Tokens.scala
index fdc6385b6e..b7a568efea 100644
--- a/src/library/scala/util/parsing/syntax/Tokens.scala
+++ b/src/library/scala/util/parsing/combinator/token/Tokens.scala
@@ -6,7 +6,9 @@
** |/ **
\* */
-package scala.util.parsing.syntax
+package scala.util.parsing
+package combinator
+package token
/** This component provides the notion of `Token', the unit of information that is passed from lexical
* parsers in the `Lexical' component to the parsers in the `TokenParsers' component.
diff --git a/src/library/scala/util/parsing/input/Position.scala b/src/library/scala/util/parsing/input/Position.scala
index 6922bec19c..482610ca28 100644
--- a/src/library/scala/util/parsing/input/Position.scala
+++ b/src/library/scala/util/parsing/input/Position.scala
@@ -53,7 +53,7 @@ trait Position {
*<pre> List(this, is, a, line, from, the, document)
* ^</pre>
*/
- def longString = lineContents+"\n"+(" " * (column - 1))+"^"
+ def longString = lineContents+"\n"+lineContents.take(column-1).map{x => if (x == '\t') x else ' ' } + "^"
/** Compare this position to another, by first comparing their line numbers,
* and then -- if necessary -- using the columns to break a tie.
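The `longString` change above keeps tabs from the source line in the caret line, so the `^` stays visually aligned regardless of tab width. The same computation in isolation (a sketch mirroring the replaced expression):

```scala
// Spaces for ordinary characters, tabs preserved, so the caret's column
// matches the source line's on-screen rendering.
def caretLine(lineContents: String, column: Int): String =
  lineContents + "\n" +
    lineContents.take(column - 1).map(x => if (x == '\t') x else ' ') + "^"

val shown = caretLine("\tval x = 1", 6)
println(shown) // caret lands under the 'x', even after the leading tab
```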
diff --git a/src/library/scala/util/parsing/syntax/package.scala b/src/library/scala/util/parsing/syntax/package.scala
new file mode 100644
index 0000000000..9dc909ca60
--- /dev/null
+++ b/src/library/scala/util/parsing/syntax/package.scala
@@ -0,0 +1,19 @@
+/* __ *\
+** ________ ___ / / ___ Scala API **
+** / __/ __// _ | / / / _ | (c) 2006-2010, LAMP/EPFL **
+** __\ \/ /__/ __ |/ /__/ __ | **
+** /____/\___/_/ |_/____/_/ | | **
+** |/ **
+\* */
+
+package scala.util.parsing
+
+import scala.util.parsing.combinator.token
+
+/** If deprecating the whole package worked, that's what would best
+ * be done, but it doesn't (yet) so it isn't.
+ */
+package object syntax {
+ @deprecated("Moved to scala.util.parsing.combinator.token") type Tokens = token.Tokens
+ @deprecated("Moved to scala.util.parsing.combinator.token") type StdTokens = token.StdTokens
+}
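The deprecated type aliases above are a general migration technique: the old name keeps compiling while pointing at the new home. A toy sketch with hypothetical names (plain objects standing in for the package objects):

```scala
object newhome { trait Widget }

object oldhome {
  // Old name forwards to the new location; uses emit a deprecation warning.
  @deprecated("Moved to newhome") type Widget = newhome.Widget
}

val w: oldhome.Widget = new newhome.Widget {} // still compiles, with a warning
println(w.isInstanceOf[newhome.Widget]) // true
```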
diff --git a/src/library/scala/xml/Atom.scala b/src/library/scala/xml/Atom.scala
index 129a0803d2..7c66995573 100644
--- a/src/library/scala/xml/Atom.scala
+++ b/src/library/scala/xml/Atom.scala
@@ -8,9 +8,7 @@
// $Id$
-
package scala.xml
-import collection.mutable.StringBuilder
/** The class <code>Atom</code> provides an XML node for text (PCDATA).
* It is used in both non-bound and bound XML representations.
@@ -24,17 +22,21 @@ class Atom[+A](val data: A) extends SpecialNode
if (data == null)
throw new IllegalArgumentException("cannot construct Atom(null)")
+ override def basisForHashCode: Seq[Any] = Seq(data)
+ override def strict_==(other: Equality) = other match {
+ case x: Atom[_] => data == x.data
+ case _ => false
+ }
+ override def canEqual(other: Any) = other match {
+ case _: Atom[_] => true
+ case _ => false
+ }
+
final override def doCollectNamespaces = false
final override def doTransform = false
def label = "#PCDATA"
- override def equals(x: Any) = x match {
- case s:Atom[_] => data == s.data
- case _ => false
- }
- override def hashCode() = data.hashCode()
-
/** Returns text, with some characters escaped according to the XML
* specification.
*
diff --git a/src/library/scala/xml/Attribute.scala b/src/library/scala/xml/Attribute.scala
index 8ff9fb2ed7..3259526d98 100644
--- a/src/library/scala/xml/Attribute.scala
+++ b/src/library/scala/xml/Attribute.scala
@@ -10,9 +10,6 @@
package scala.xml
-import collection.Seq
-import collection.mutable.StringBuilder
-
/** Attribute defines the interface shared by both
* PrefixedAttribute and UnprefixedAttribute
*/
@@ -45,6 +42,7 @@ object Attribute {
abstract trait Attribute extends MetaData
{
+ def pre: String // will be null if unprefixed
val key: String
val value: Seq[Node]
val next: MetaData
@@ -52,13 +50,43 @@ abstract trait Attribute extends MetaData
def apply(key: String): Seq[Node]
def apply(namespace: String, scope: NamespaceBinding, key: String): Seq[Node]
def copy(next: MetaData): Attribute
- def remove(key: String): MetaData
- def remove(namespace: String, scope: NamespaceBinding, key: String): MetaData
- def isPrefixed: Boolean
+ def remove(key: String) =
+ if (!isPrefixed && this.key == key) next
+ else copy(next remove key)
+
+ def remove(namespace: String, scope: NamespaceBinding, key: String) =
+ if (isPrefixed && this.key == key && (scope getURI pre) == namespace) next
+ else next.remove(namespace, scope, key)
+
+ def isPrefixed: Boolean = pre != null
def getNamespace(owner: Node): String
- def wellformed(scope: NamespaceBinding): Boolean
+ def wellformed(scope: NamespaceBinding): Boolean = {
+ val arg = if (isPrefixed) scope getURI pre else null
+ (next(arg, scope, key) == null) && (next wellformed scope)
+ }
- def equals1(m: MetaData): Boolean
- def toString1(sb: StringBuilder): Unit
+ override def canEqual(other: Any) = other match {
+ case _: Attribute => true
+ case _ => false
+ }
+ override def strict_==(other: Equality) = other match {
+ case x: Attribute => (pre == x.pre) && (key == x.key) && (value sameElements x.value)
+ case _ => false
+ }
+ override def basisForHashCode = List(pre, key, value)
+
+ /** Appends string representation of only this attribute to stringbuffer.
+ */
+ def toString1(sb: StringBuilder) {
+ if (value == null)
+ return
+ if (isPrefixed)
+ sb append pre append ':'
+
+ sb append key append '='
+ val sb2 = new StringBuilder()
+ Utility.sequenceToXML(value, TopScope, sb2, true)
+ Utility.appendQuoted(sb2.toString(), sb)
+ }
}
diff --git a/src/library/scala/xml/Comment.scala b/src/library/scala/xml/Comment.scala
index 4e8cff8d75..9608748601 100644
--- a/src/library/scala/xml/Comment.scala
+++ b/src/library/scala/xml/Comment.scala
@@ -10,7 +10,7 @@
package scala.xml
-import collection.mutable.StringBuilder
+
/** The class <code>Comment</code> implements an XML node for comments.
*
diff --git a/src/library/scala/xml/Document.scala b/src/library/scala/xml/Document.scala
index 3ac50b80b7..6c73252a37 100644
--- a/src/library/scala/xml/Document.scala
+++ b/src/library/scala/xml/Document.scala
@@ -87,4 +87,8 @@ class Document extends NodeSeq with pull.XMLEvent {
def theSeq: Seq[Node] = this.docElem
+ override def canEqual(other: Any) = other match {
+ case _: Document => true
+ case _ => false
+ }
}
diff --git a/src/library/scala/xml/Elem.scala b/src/library/scala/xml/Elem.scala
index 18b513527c..9c58177417 100644
--- a/src/library/scala/xml/Elem.scala
+++ b/src/library/scala/xml/Elem.scala
@@ -8,11 +8,8 @@
// $Id$
-
package scala.xml
-import collection.Seq
-
/** This singleton object contains the apply and unapplySeq methods for convenient construction and
* deconstruction. It is possible to deconstruct any Node instance (that is not a SpecialNode or
* a Group) using the syntax
@@ -26,8 +23,10 @@ object Elem
def apply(prefix: String,label: String, attributes: MetaData, scope: NamespaceBinding, child: Node*) =
new Elem(prefix,label,attributes,scope,child:_*)
- def unapplySeq(n:Node) = if (n.isInstanceOf[SpecialNode] || n.isInstanceOf[Group]) None else
- Some((n.prefix, n.label, n.attributes, n.scope, n.child))
+ def unapplySeq(n: Node) = n match {
+ case _: SpecialNode | _: Group => None
+ case _ => Some((n.prefix, n.label, n.attributes, n.scope, n.child))
+ }
}
/** The case class <code>Elem</code> extends the <code>Node</code> class,
@@ -54,18 +53,17 @@ extends Node
final override def doCollectNamespaces = true
final override def doTransform = true
- if ((null != prefix) && 0 == prefix.length())
+ if (prefix == "")
throw new IllegalArgumentException("prefix of zero length, use null instead")
- if (null == scope)
- throw new IllegalArgumentException("scope is null, try xml.TopScope for empty scope")
+ if (scope == null)
+ throw new IllegalArgumentException("scope is null, use xml.TopScope for empty scope")
//@todo: copy the children,
// setting namespace scope if necessary
// cleaning adjacent text nodes if necessary
- override def hashCode(): Int =
- Utility.hashCode(prefix, label, attributes.hashCode(), scope.hashCode(), child)
+ override def basisForHashCode: Seq[Any] = prefix :: label :: attributes :: child.toList
/** Returns a new element with updated attributes, resolving namespace uris from this element's scope.
* See MetaData.update for details.
diff --git a/src/library/scala/xml/EntityRef.scala b/src/library/scala/xml/EntityRef.scala
index 0806b8fa68..fbc1f351cf 100644
--- a/src/library/scala/xml/EntityRef.scala
+++ b/src/library/scala/xml/EntityRef.scala
@@ -10,7 +10,7 @@
package scala.xml
-import collection.mutable.StringBuilder
+
/** The class <code>EntityRef</code> implements an XML node for entity
diff --git a/src/library/scala/xml/Equality.scala b/src/library/scala/xml/Equality.scala
new file mode 100644
index 0000000000..d09ae10b2d
--- /dev/null
+++ b/src/library/scala/xml/Equality.scala
@@ -0,0 +1,115 @@
+/* __ *\
+** ________ ___ / / ___ Scala API **
+** / __/ __// _ | / / / _ | (c) 2003-2010, LAMP/EPFL **
+** __\ \/ /__/ __ |/ /__/ __ | http://scala-lang.org/ **
+** /____/\___/_/ |_/____/_/ | | **
+** |/ **
+\* */
+
+package scala.xml
+
+/** In an attempt to contain the damage being inflicted on
+ * consistency by the ad hoc equals methods spread around
+ * xml, the logic is centralized and all the xml classes
+ * go through the xml.Equality trait. There are two forms
+ * of xml comparison.
+ *
+ * 1) def strict_==(other: xml.Equality)
+ *
+ * This one tries to honor the little things like symmetry
+ * and hashCode contracts. The equals method routes all
+ * comparisons through this.
+ *
+ * 2) xml_==(other: Any)
+ *
+ * This one picks up where strict_== leaves off. It might
+ * declare any two things equal.
+ *
+ * As things stood, the logic not only made a mockery of
+ * the collections equals contract, but also laid waste to
+ * that of case classes.
+ *
+ * Among the obstacles to sanity are/were:
+ *
+ * Node extends NodeSeq extends Seq[Node]
+ * MetaData extends Iterable[MetaData]
+ * The hacky "Group" xml node which throws exceptions
+ * with wild abandon, so don't get too close
+ * Rampant asymmetry and impossible hashCodes
+ * Most classes claiming to be equal to "String" if
+ * some specific stringification of it was the same.
+ * String was never going to return the favor.
+ */
+
+object Equality {
+ def asRef(x: Any): AnyRef = x.asInstanceOf[AnyRef]
+
+ /** Note - these functions assume strict equality has already failed.
+ */
+ def compareBlithely(x1: AnyRef, x2: String): Boolean = x1 match {
+ case x: Atom[_] => x.data == x2
+ case x: NodeSeq => x.text == x2
+ case _ => false
+ }
+ def compareBlithely(x1: AnyRef, x2: Node): Boolean = x1 match {
+ case x: NodeSeq if x.length == 1 => x2 == x(0)
+ case _ => false
+ }
+ def compareBlithely(x1: AnyRef, x2: AnyRef): Boolean = {
+ if (x1 == null || x2 == null)
+ return (x1 eq x2)
+
+ x2 match {
+ case s: String => compareBlithely(x1, s)
+ case n: Node => compareBlithely(x1, n)
+ case _ => false
+ }
+ }
+}
+import Equality._
+
+private[xml]
+trait Equality extends scala.Equals {
+ def basisForHashCode: Seq[Any]
+ def strict_==(other: Equality): Boolean
+ def strict_!=(other: Equality) = !strict_==(other)
+
+ private def hashOf(x: Any) = if (x == null) 1 else x.hashCode()
+
+ /** We insist we're only equal to other xml.Equality implementors,
+ * which heads off a lot of inconsistency up front.
+ */
+ override def canEqual(other: Any): Boolean = other match {
+ case x: Equality => true
+ case _ => false
+ }
+
+ /** It'd be nice to make these final, but there are probably
+ * people out there subclassing the XML types, especially when
+ * it comes to equals. However WE at least can pretend they
+ * are final since clearly individual classes cannot be trusted
+ * to maintain a semblance of order.
+ */
+ override def hashCode() = basisForHashCode match {
+ case Nil => 0
+ case x :: xs => hashOf(x) * 41 + (xs map hashOf).foldLeft(0)(_ * 7 + _)
+ }
+ override def equals(other: Any) = doComparison(other, false)
+ final def xml_==(other: Any) = doComparison(other, true)
+ final def xml_!=(other: Any) = !xml_==(other)
+
+ /** The "blithe" parameter expresses the caller's unconcerned attitude
+ * regarding the usual constraints on equals. The method is thereby
+ * given carte blanche to declare any two things equal.
+ */
+ private def doComparison(other: Any, blithe: Boolean) = {
+ val strictlyEqual = other match {
+ case x: AnyRef if this eq x => true
+ case x: Equality => (x canEqual this) && (this strict_== x)
+ case _ => false
+ }
+
+ strictlyEqual || (blithe && compareBlithely(this, asRef(other)))
+ }
+}
+
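The trait above ties `equals`, `hashCode`, and `canEqual` to a single basis so the two contracts cannot drift apart. A self-contained miniature of the same pattern (`Pt` is a hypothetical class, not part of the patch):

```scala
trait Eq extends Equals {
  def basisForHashCode: Seq[Any]
  def strict_==(other: Eq): Boolean

  // hashCode is derived from the same data that equality inspects.
  override def hashCode() = basisForHashCode.foldLeft(0)(_ * 41 + _.hashCode)
  override def equals(other: Any) = other match {
    case x: Eq => (x canEqual this) && (this strict_== x)
    case _     => false
  }
}

final class Pt(val x: Int, val y: Int) extends Eq {
  def basisForHashCode = Seq(x, y)
  def strict_==(other: Eq) = other match {
    case p: Pt => x == p.x && y == p.y
    case _     => false
  }
  def canEqual(other: Any) = other.isInstanceOf[Pt]
}

val a = new Pt(1, 2)
val b = new Pt(1, 2)
println(a == b)                   // true
println(a.hashCode == b.hashCode) // true
```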
diff --git a/src/library/scala/xml/Group.scala b/src/library/scala/xml/Group.scala
index 91ddba6ce6..8b714d2813 100644
--- a/src/library/scala/xml/Group.scala
+++ b/src/library/scala/xml/Group.scala
@@ -8,9 +8,7 @@
// $Id$
-
package scala.xml
-import collection.Seq
/** A hack to group XML nodes in one node for output.
*
@@ -18,49 +16,27 @@ import collection.Seq
* @version 1.0
*/
@serializable
-case class Group(val nodes: Seq[Node]) extends Node {
- // final override def doTransform = false
+final case class Group(val nodes: Seq[Node]) extends Node {
override def theSeq = nodes
- /** XXX this is ridiculous, we can't do equality like this. */
- override def equals(x: Any) = x match {
- case z:Group => (length == z.length) && sameElements(z)
- case z:Node => (length == 1) && z == apply(0)
- case z:Seq[_] => sameElements(z)
- case z:String => text == z
- case _ => false
+ override def canEqual(other: Any) = other match {
+ case x: Group => true
+ case _ => false
}
- /* As if there were a hashCode which could back up the above implementation! */
- override def hashCode = nodes.hashCode
-
- /**
- * @throws Predef.UnsupportedOperationException (always)
- */
- final def label =
- throw new UnsupportedOperationException("class Group does not support method 'label'")
-
- /**
- * @throws Predef.UnsupportedOperationException (always)
- */
- final override def attributes =
- throw new UnsupportedOperationException("class Group does not support method 'attributes'")
-
- /**
- * @throws Predef.UnsupportedOperationException (always)
- */
- final override def namespace =
- throw new UnsupportedOperationException("class Group does not support method 'namespace'")
+ override def strict_==(other: Equality) = other match {
+ case Group(xs) => nodes sameElements xs
+ case _ => false
+ }
+ override def basisForHashCode = nodes
- /**
- * @throws Predef.UnsupportedOperationException (always)
+ /** Since Group is very much a hack it throws an exception if you
+ * try to do anything with it.
*/
- final override def child =
- throw new UnsupportedOperationException("class Group does not support method 'child'")
+ private def fail(msg: String) = throw new UnsupportedOperationException("class Group does not support method '%s'" format msg)
- /**
- * @throws Predef.UnsupportedOperationException (always)
- */
- def buildString(sb: StringBuilder) =
- throw new UnsupportedOperationException(
- "class Group does not support method toString(StringBuilder)")
+ def label = fail("label")
+ override def attributes = fail("attributes")
+ override def namespace = fail("namespace")
+ override def child = fail("child")
+ def buildString(sb: StringBuilder) = fail("toString(StringBuilder)")
}
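The Group hunk above collapses five near-identical throwing methods into one private `fail` helper. A minimal self-contained sketch of that pattern (the `Leaf` class here is an invented stand-in, not part of scala.xml):

```scala
// A helper returning Nothing conforms to any result type, so one
// `fail` can implement members with different signatures.
class Leaf {
  private def fail(msg: String): Nothing =
    throw new UnsupportedOperationException(
      "class Leaf does not support method '%s'".format(msg))

  def label: String      = fail("label")
  def attributes: String = fail("attributes")
}
```

Because `fail` has type `Nothing`, each one-liner type-checks against its declared result type without casts.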
diff --git a/src/library/scala/xml/MetaData.scala b/src/library/scala/xml/MetaData.scala
index 01b1cbc1d9..744b662fb8 100644
--- a/src/library/scala/xml/MetaData.scala
+++ b/src/library/scala/xml/MetaData.scala
@@ -8,14 +8,10 @@
// $Id$
-
package scala.xml
import Utility.sbToString
import annotation.tailrec
-import collection.immutable.List
-import collection.{Seq, Iterator, Iterable}
-import collection.mutable.StringBuilder
/**
@@ -77,7 +73,7 @@ object MetaData {
* @author Burak Emir <bqe@google.com>
*/
@serializable
-abstract class MetaData extends Iterable[MetaData]
+abstract class MetaData extends Iterable[MetaData] with Equality
{
/** Updates this MetaData with the MetaData given as argument. All attributes that occur in updates
* are part of the resulting MetaData. If an attribute occurs in both this instance and
@@ -118,13 +114,6 @@ abstract class MetaData extends Iterable[MetaData]
*/
def apply(namespace_uri:String, scp:NamespaceBinding, k:String): Seq[Node]
- /**
- * @param m ...
- * @return <code>true</code> iff ...
- */
- def containedIn1(m: MetaData): Boolean =
- m != null && (m.equals1(this) || containedIn1(m.next))
-
/** returns a copy of this MetaData item with next field set to argument.
*
* @param next ...
@@ -143,22 +132,20 @@ abstract class MetaData extends Iterable[MetaData]
def isPrefixed: Boolean
- /** deep equals method - XXX */
- override def equals(that: Any) = that match {
- case m: MetaData =>
- (this.length == m.length) &&
- (this.hashCode == m.hashCode) &&
- (this forall (_ containedIn1 m))
+ override def canEqual(other: Any) = other match {
+ case _: MetaData => true
+ case _ => false
+ }
+ override def strict_==(other: Equality) = other match {
+ case m: MetaData => this.toSet == m.toSet
case _ => false
}
+ def basisForHashCode: Seq[Any] = List(this.toSet)
/** Returns an iterator on attributes */
- def iterator: Iterator[MetaData] = Iterator.iterate(this)(_.next) takeWhile (_ != Null)
+ def iterator: Iterator[MetaData] = Iterator.single(this) ++ next.iterator
override def size: Int = 1 + iterator.length
- /** shallow equals method */
- def equals1(that: MetaData): Boolean
-
/** filters this sequence of meta data */
override def filter(f: MetaData => Boolean): MetaData =
if (f(this)) copy(next filter f)
@@ -170,8 +157,18 @@ abstract class MetaData extends Iterable[MetaData]
/** returns value of this MetaData item */
def value: Seq[Node]
- /** maps this sequence of meta data */
- def map(f: MetaData => Text): List[Text] = (iterator map f).toList
+ /** Returns a String containing "prefix:key" if the first key is
+ * prefixed, and "key" otherwise.
+ */
+ def prefixedKey = this match {
+ case x: Attribute if x.isPrefixed => x.pre + ":" + key
+ case _ => key
+ }
+
+ /** Returns a Map containing the attributes stored as key/value pairs.
+ */
+ def asAttrMap: Map[String, String] =
+ iterator map (x => (x.prefixedKey, x.value.text)) toMap
/** returns Null or the next MetaData item */
def next: MetaData
@@ -198,8 +195,6 @@ abstract class MetaData extends Iterable[MetaData]
final def get(uri: String, scope: NamespaceBinding, key: String): Option[Seq[Node]] =
Option(apply(uri, scope, key))
- override def hashCode(): Int
-
def toString1(): String = sbToString(toString1)
// appends string representations of single attribute to StringBuilder
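The MetaData changes above treat attributes as a linked list of tails (`Iterator.single(this) ++ next.iterator`, terminated by `Null`) and add `asAttrMap`. A toy model of that structure, with invented names (`Meta`, `Attr`, `End`) rather than the real scala.xml types:

```scala
// Each attribute carries its `next`; the sentinel End yields an empty iterator,
// which terminates the concatenation in the trait's default iterator.
sealed trait Meta extends Iterable[Meta] {
  def key: String
  def value: String
  def next: Meta
  def iterator: Iterator[Meta] = Iterator.single(this) ++ next.iterator
  def asAttrMap: Map[String, String] =
    iterator.map(m => (m.key, m.value)).toMap
}
case class Attr(key: String, value: String, next: Meta) extends Meta
case object End extends Meta {
  def key = null; def value = null; def next = null
  override def iterator: Iterator[Meta] = Iterator.empty
}
```

Traversal never dereferences `next` on `End` because `End` overrides `iterator` directly, mirroring how `Null` backstops the real collection.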
diff --git a/src/library/scala/xml/NamespaceBinding.scala b/src/library/scala/xml/NamespaceBinding.scala
index 7381f0129b..47ca8fed87 100644
--- a/src/library/scala/xml/NamespaceBinding.scala
+++ b/src/library/scala/xml/NamespaceBinding.scala
@@ -12,7 +12,7 @@
package scala.xml
import Utility.sbToString
-import collection.mutable.StringBuilder
+
/** The class <code>NamespaceBinding</code> represents namespace bindings
* and scopes. The binding for the default namespace is treated as a null
@@ -23,7 +23,7 @@ import collection.mutable.StringBuilder
* @version 1.0
*/
@SerialVersionUID(0 - 2518644165573446725L)
-case class NamespaceBinding(prefix: String, uri: String, parent: NamespaceBinding) extends AnyRef
+case class NamespaceBinding(prefix: String, uri: String, parent: NamespaceBinding) extends AnyRef with Equality
{
if (prefix == "")
throw new IllegalArgumentException("zero length prefix not allowed")
@@ -41,6 +41,15 @@ case class NamespaceBinding(prefix: String, uri: String, parent: NamespaceBindin
if (_uri == uri) prefix else parent getPrefix _uri
override def toString(): String = sbToString(buildString(_, TopScope))
+ override def canEqual(other: Any) = other match {
+ case _: NamespaceBinding => true
+ case _ => false
+ }
+ override def strict_==(other: Equality) = other match {
+ case x: NamespaceBinding => (prefix == x.prefix) && (uri == x.uri) && (parent == x.parent)
+ case _ => false
+ }
+ def basisForHashCode: Seq[Any] = List(prefix, uri, parent)
def buildString(stop: NamespaceBinding): String = sbToString(buildString(_, stop))
def buildString(sb: StringBuilder, stop: NamespaceBinding): Unit = {
diff --git a/src/library/scala/xml/Node.scala b/src/library/scala/xml/Node.scala
index f206140fd4..5117bb9282 100644
--- a/src/library/scala/xml/Node.scala
+++ b/src/library/scala/xml/Node.scala
@@ -8,13 +8,8 @@
// $Id$
-
package scala.xml
-import collection.Seq
-import collection.immutable.{List, Nil}
-import collection.mutable.StringBuilder
-
/**
* This object provides methods ...
*
@@ -22,7 +17,6 @@ import collection.mutable.StringBuilder
* @version 1.0
*/
object Node {
-
/** the constant empty attribute sequence */
final def NoAttributes: MetaData = Null
@@ -30,7 +24,6 @@ object Node {
val EmptyNamespace = ""
def unapplySeq(n: Node) = Some((n.label, n.attributes, n.child))
-
}
/**
@@ -116,6 +109,10 @@ abstract class Node extends NodeSeq {
*/
def child: Seq[Node]
+ /** Children which do not stringify to "" (needed for equality)
+ */
+ def nonEmptyChildren: Seq[Node] = child filterNot (_.toString == "")
+
/**
* Descendant axis (all descendants of this node, not including node itself)
* includes all text nodes, element nodes, comments and processing instructions.
@@ -129,41 +126,24 @@ abstract class Node extends NodeSeq {
*/
def descendant_or_self: List[Node] = this :: descendant
- /**
- * Returns true if x is structurally equal to this node. Compares prefix,
- * label, attributes and children.
- *
- * @param x ...
- * @return <code>true</code> if ..
- */
- override def equals(x: Any): Boolean = x match {
- case g: Group => false
- case that: Node =>
- this.prefix == that.prefix &&
- this.label == that.label &&
- this.attributes == that.attributes &&
- this.scope == that.scope &&
- equalChildren(that)
+ override def canEqual(other: Any) = other match {
+ case x: Group => false
+ case x: Node => true
case _ => false
}
-
- // children comparison has to be done carefully - see bug #1773.
- // It would conceivably be a better idea for a scala block which
- // generates the empty string not to generate a child rather than
- // our having to filter it later, but that approach would be more
- // delicate to implement.
- private def equalChildren(that: Node) = {
- def noEmpties(xs: Seq[Node]) = xs filter (_.toString() != "")
- noEmpties(this.child) sameElements noEmpties(that.child)
+ override def basisForHashCode: Seq[Any] = prefix :: label :: attributes :: nonEmptyChildren.toList
+ override def strict_==(other: Equality) = other match {
+ case _: Group => false
+ case x: Node =>
+ (prefix == x.prefix) &&
+ (label == x.label) &&
+ (attributes == x.attributes) &&
+ // (scope == x.scope) // note - original code didn't compare scopes so I left it as is.
+ (nonEmptyChildren sameElements x.nonEmptyChildren)
+ case _ =>
+ false
}
- /** <p>
- * Returns a hashcode.
- * </p>
- */
- override def hashCode(): Int =
- Utility.hashCode(prefix, label, attributes.hashCode(), scope.hashCode(), child)
-
// implementations of NodeSeq methods
/**
@@ -213,9 +193,10 @@ abstract class Node extends NodeSeq {
* Martin to Burak: to do: if you make this method abstract, the compiler will now
* complain if there's no implementation in a subclass. Is this what we want? Note that
* this would break doc/DocGenator and doc/ModelToXML, with an error message like:
-doc\DocGenerator.scala:1219: error: object creation impossible, since there is a deferred declaration of method text in class Node of type => String which is not implemented in a subclass
- new SpecialNode {
- ^
- */
+ * {{{
+ * doc\DocGenerator.scala:1219: error: object creation impossible, since there is a deferred declaration of method text in class Node of type => String which is not implemented in a subclass
+ * new SpecialNode {
+ * ^
+ * }}} */
override def text: String = super.text
}
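The Node hunk replaces ad-hoc `equals` overrides with the `canEqual` / `strict_==` / `basisForHashCode` trio. A minimal sketch of how those three cooperate to give symmetric equality and a consistent hashCode (trait `Eq` and class `Pt` are invented for illustration; the real trait is scala.xml.Equality):

```scala
// equals delegates to canEqual (both sides agree they are comparable)
// and strict_== (field-by-field test); hashCode derives from the same
// basis that strict_== compares, keeping the two consistent.
trait Eq {
  def canEqual(other: Any): Boolean
  def strict_==(other: Eq): Boolean
  def basisForHashCode: Seq[Any]
  override def hashCode: Int = basisForHashCode.hashCode
  override def equals(other: Any): Boolean = other match {
    case x: Eq => (x canEqual this) && (this strict_== x)
    case _     => false
  }
}

class Pt(val x: Int, val y: Int) extends Eq {
  def canEqual(other: Any) = other.isInstanceOf[Pt]
  def strict_==(other: Eq) = other match {
    case p: Pt => x == p.x && y == p.y
    case _     => false
  }
  def basisForHashCode = List(x, y)
}
```

This is why Node's `canEqual` rejects `Group` explicitly: a Group may never compare equal to a plain Node, from either side of the comparison.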
diff --git a/src/library/scala/xml/NodeBuffer.scala b/src/library/scala/xml/NodeBuffer.scala
index 49efe0a5ca..2cf999e8a4 100644
--- a/src/library/scala/xml/NodeBuffer.scala
+++ b/src/library/scala/xml/NodeBuffer.scala
@@ -8,11 +8,8 @@
// $Id$
-
package scala.xml
-import collection.{Iterator, Seq, Iterable}
-
/**
* <p>
* This class acts as a Buffer for nodes. If it is used as a sequence
diff --git a/src/library/scala/xml/NodeSeq.scala b/src/library/scala/xml/NodeSeq.scala
index 17ea9228f6..3b56ba25e4 100644
--- a/src/library/scala/xml/NodeSeq.scala
+++ b/src/library/scala/xml/NodeSeq.scala
@@ -11,11 +11,9 @@
package scala.xml
-import collection.immutable
-import collection.immutable.{List, Nil, ::}
-import collection.{Seq, SeqLike}
-import collection.mutable.{Builder, ListBuffer}
-import collection.generic.CanBuildFrom
+import collection.{ mutable, immutable, generic, SeqLike }
+import mutable.{ Builder, ListBuffer }
+import generic.{ CanBuildFrom }
/** This object ...
*
@@ -43,7 +41,7 @@ object NodeSeq {
* @author Burak Emir
* @version 1.0
*/
-abstract class NodeSeq extends immutable.Seq[Node] with SeqLike[Node, NodeSeq] {
+abstract class NodeSeq extends immutable.Seq[Node] with SeqLike[Node, NodeSeq] with Equality {
import NodeSeq.seqToNodeSeq // import view magic for NodeSeq wrappers
/** Creates a list buffer as builder for this class */
@@ -56,12 +54,23 @@ abstract class NodeSeq extends immutable.Seq[Node] with SeqLike[Node, NodeSeq] {
def apply(i: Int): Node = theSeq(i)
def apply(f: Node => Boolean): NodeSeq = filter(f)
- /** structural equality (XXX - this shatters any hope of hashCode equality) */
- override def equals(x: Any): Boolean = x match {
- case z:Node => (length == 1) && z == apply(0)
- case z:Seq[_] => sameElements(z)
- case z:String => text == z
- case _ => false
+ def xml_sameElements[A](that: Iterable[A]): Boolean = {
+ val these = this.iterator
+ val those = that.iterator
+ while (these.hasNext && those.hasNext)
+ if (these.next xml_!= those.next)
+ return false
+
+ !these.hasNext && !those.hasNext
+ }
+ def basisForHashCode: Seq[Any] = theSeq
+ override def canEqual(other: Any) = other match {
+ case _: NodeSeq => true
+ case _ => false
+ }
+ override def strict_==(other: Equality) = other match {
+ case x: NodeSeq => (length == x.length) && (theSeq sameElements x.theSeq)
+ case _ => false
}
/** Projection function. Similar to XPath, use <code>this \ "foo"</code>
@@ -80,8 +89,8 @@ abstract class NodeSeq extends immutable.Seq[Node] with SeqLike[Node, NodeSeq] {
* @return ...
*/
def \(that: String): NodeSeq = {
+ def fail = throw new IllegalArgumentException(that)
def atResult = {
- def fail = throw new IllegalArgumentException(that)
lazy val y = this(0)
val attr =
if (that.length == 1) fail
@@ -92,7 +101,7 @@ abstract class NodeSeq extends immutable.Seq[Node] with SeqLike[Node, NodeSeq] {
if (uri == "" || key == "") fail
else y.attribute(uri, key)
}
- else y.attribute(that.substring(1))
+ else y.attribute(that drop 1)
attr match {
case Some(x) => Group(x)
@@ -104,6 +113,7 @@ abstract class NodeSeq extends immutable.Seq[Node] with SeqLike[Node, NodeSeq] {
NodeSeq fromSeq (this flatMap (_.child) filter cond)
that match {
+ case "" => fail
case "_" => makeSeq(!_.isAtom)
case _ if (that(0) == '@' && this.length == 1) => atResult
case _ => makeSeq(_.label == that)
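The `xml_sameElements` method added to NodeSeq walks two iterators in lockstep and fails fast on the first mismatch or length difference. The same loop, generalized over a caller-supplied comparison (a sketch; `pairwiseSame` is an invented name):

```scala
object SameElems {
  // Returns false on the first differing pair, or if one iterator
  // is exhausted before the other (i.e. the lengths differ).
  def pairwiseSame[A](xs: Iterable[A], ys: Iterable[A])(same: (A, A) => Boolean): Boolean = {
    val these = xs.iterator
    val those = ys.iterator
    while (these.hasNext && those.hasNext)
      if (!same(these.next(), those.next()))
        return false
    !these.hasNext && !those.hasNext
  }
}
```

In the patch the comparison is `xml_!=`, the loose XML inequality, rather than universal `==`.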
diff --git a/src/library/scala/xml/Null.scala b/src/library/scala/xml/Null.scala
index a3246d4b57..d6f06fc3cd 100644
--- a/src/library/scala/xml/Null.scala
+++ b/src/library/scala/xml/Null.scala
@@ -12,70 +12,49 @@
package scala.xml
import Utility.{ isNameStart }
-import collection.Iterator
-import collection.immutable.{Nil, List}
-import collection.mutable.StringBuilder
+/** Essentially, every method in here is a dummy, returning Zero[T].
+ * It provides a backstop for the unusual collection defined by MetaData,
+ * sort of a linked list of tails.
+ */
case object Null extends MetaData {
-
- /** appends given MetaData items to this MetaData list */
- override def append(m: MetaData, scope: NamespaceBinding = TopScope): MetaData = m
-
- override def containedIn1(m: MetaData): Boolean = false
-
- /** returns its argument */
- def copy(next: MetaData) = next
-
override def iterator = Iterator.empty
-
+ override def append(m: MetaData, scope: NamespaceBinding = TopScope): MetaData = m
override def filter(f: MetaData => Boolean): MetaData = this
+ def copy(next: MetaData) = next
def getNamespace(owner: Node) = null
- final override def hasNext = false
+ override def hasNext = false
def next = null
def key = null
def value = null
-
- final override def length = 0
- final override def length(i: Int) = i
-
def isPrefixed = false
- /** deep equals method - XXX */
- override def equals(that: Any) = that match {
- case m: MetaData => m.length == 0
- case _ => false
- }
+ override def length = 0
+ override def length(i: Int) = i
- def equals1(that:MetaData) = that.length == 0
-
- override def map(f: MetaData => Text): List[Text] = Nil
+ override def strict_==(other: Equality) = other match {
+ case x: MetaData => x.length == 0
+ case _ => false
+ }
+ override def basisForHashCode: Seq[Any] = Nil
- /** null */
+ def apply(namespace: String, scope: NamespaceBinding, key: String) = null
def apply(key: String) = {
- if(!isNameStart(key charAt 0))
+ if (!isNameStart(key.head))
throw new IllegalArgumentException("not a valid attribute name '"+key+"', so can never match !")
+
null
}
- /** gets value of qualified (prefixed) attribute with given key */
- def apply(namespace: String, scope: NamespaceBinding, key: String) = null
-
- override def hashCode(): Int = 0
-
+ def toString1(sb: StringBuilder) = ()
override def toString1(): String = ""
-
- //appends string representations of single attribute to StringBuilder
- def toString1(sb:StringBuilder) = {}
-
override def toString(): String = ""
override def buildString(sb: StringBuilder): StringBuilder = sb
-
override def wellformed(scope: NamespaceBinding) = true
def remove(key: String) = this
-
def remove(namespace: String, scope: NamespaceBinding, key: String) = this
}
diff --git a/src/library/scala/xml/PCData.scala b/src/library/scala/xml/PCData.scala
index 5cf4bda070..fa44591496 100644
--- a/src/library/scala/xml/PCData.scala
+++ b/src/library/scala/xml/PCData.scala
@@ -7,16 +7,9 @@ package scala.xml
* and is to be preserved as CDATA section in the output.
*/
case class PCData(_data: String) extends Atom[String](_data) {
- /* The following code is a derivative work of scala.xml.Text */
if (null == data)
throw new IllegalArgumentException("tried to construct PCData with null")
- final override def equals(x: Any) = x match {
- case s: String => s.equals(data)
- case s: Atom[_] => data == s.data
- case _ => false
- }
-
/** Returns text, with some characters escaped according to the XML
* specification.
*
diff --git a/src/library/scala/xml/PrefixedAttribute.scala b/src/library/scala/xml/PrefixedAttribute.scala
index a465e61ba3..d7c04ab6ad 100644
--- a/src/library/scala/xml/PrefixedAttribute.scala
+++ b/src/library/scala/xml/PrefixedAttribute.scala
@@ -8,13 +8,8 @@
// $Id$
-
package scala.xml
-import collection.Seq
-import collection.mutable.StringBuilder
-
-
/** prefixed attributes always have a non-null namespace.
*
* @param pre ...
@@ -36,24 +31,12 @@ extends Attribute
def this(pre: String, key: String, value: String, next: MetaData) =
this(pre, key, Text(value), next)
- /*
- // the problem here is the fact that we cannot remove the proper attribute from
- // next, and thus cannot guarantee that hashcodes are computed properly
- def this(pre: String, key: String, value: scala.AllRef, next: MetaData) =
- throw new UnsupportedOperationException("can't construct prefixed nil attributes")
- */
-
/** Returns a copy of this unprefixed attribute with the given
* next field.
*/
def copy(next: MetaData) =
new PrefixedAttribute(pre, key, value, next)
- def equals1(m: MetaData) =
- (m.isPrefixed &&
- (m.asInstanceOf[PrefixedAttribute].pre == pre) &&
- (m.key == key) && (m.value sameElements value))
-
def getNamespace(owner: Node) =
owner.getNamespace(pre)
@@ -68,41 +51,8 @@ extends Attribute
else
next(namespace, scope, key)
}
-
- /** returns true */
- final def isPrefixed = true
-
- /** returns the hashcode.
- */
- override def hashCode() =
- pre.hashCode() * 41 + key.hashCode() * 7 + next.hashCode()
-
-
- /** appends string representation of only this attribute to stringbuffer */
- def toString1(sb:StringBuilder): Unit = if(value ne null) {
- sb.append(pre)
- sb.append(':')
- sb.append(key)
- sb.append('=')
- val sb2 = new StringBuilder()
- Utility.sequenceToXML(value, TopScope, sb2, true)
- Utility.appendQuoted(sb2.toString(), sb)
- }
-
- def wellformed(scope: NamespaceBinding): Boolean =
- (null == next(scope.getURI(pre), scope, key) &&
- next.wellformed(scope))
-
- def remove(key: String) =
- copy(next.remove(key))
-
- def remove(namespace: String, scope: NamespaceBinding, key: String): MetaData =
- if (key == this.key && scope.getURI(pre) == namespace)
- next
- else
- next.remove(namespace, scope, key)
-
}
+
object PrefixedAttribute {
def unapply(x: PrefixedAttribute) = Some(x.pre, x.key, x.value, x.next)
}
diff --git a/src/library/scala/xml/PrettyPrinter.scala b/src/library/scala/xml/PrettyPrinter.scala
index 1fb922eb95..77199ca367 100644
--- a/src/library/scala/xml/PrettyPrinter.scala
+++ b/src/library/scala/xml/PrettyPrinter.scala
@@ -8,10 +8,8 @@
// $Id$
-
package scala.xml
-import scala.collection.Map
import Utility.sbToString
/** Class for pretty printing. After instantiating, you can use the
@@ -23,7 +21,7 @@ import Utility.sbToString
* @version 1.0
*
* @param width the width to fit the output into
- * @step indentation
+ * @param step indentation
*/
class PrettyPrinter(width: Int, step: Int) {
@@ -39,7 +37,6 @@ class PrettyPrinter(width: Int, step: Int) {
protected var items: List[Item] = Nil
protected var cur = 0
- //protected var pmap:Map[String,String] = _
protected def reset() = {
cur = 0
diff --git a/src/library/scala/xml/ProcInstr.scala b/src/library/scala/xml/ProcInstr.scala
index 5a4e67e647..051fd499f4 100644
--- a/src/library/scala/xml/ProcInstr.scala
+++ b/src/library/scala/xml/ProcInstr.scala
@@ -9,7 +9,6 @@
// $Id$
package scala.xml
-import collection.mutable.StringBuilder
/** an XML node for processing instructions (PI)
*
diff --git a/src/library/scala/xml/SpecialNode.scala b/src/library/scala/xml/SpecialNode.scala
index d40d829c4b..1688cd1e15 100644
--- a/src/library/scala/xml/SpecialNode.scala
+++ b/src/library/scala/xml/SpecialNode.scala
@@ -8,12 +8,8 @@
// $Id$
-
package scala.xml
-import collection.immutable.{List, Nil, ::}
-import collection.mutable.StringBuilder
-
/** <p>
* <code>SpecialNode</code> is a special XML node which
* represents either text (PCDATA), a comment, a PI, or an entity ref.
diff --git a/src/library/scala/xml/Text.scala b/src/library/scala/xml/Text.scala
index 3090883bb8..5f0b010c9f 100644
--- a/src/library/scala/xml/Text.scala
+++ b/src/library/scala/xml/Text.scala
@@ -8,11 +8,8 @@
// $Id$
-
package scala.xml
-import collection.mutable.StringBuilder
-
// XXX This attempt to make Text not a case class revealed a bug in the pattern
// matcher (see ticket #2883) so I've put the case back. (It was/is desirable that
// it not be a case class because it is using the antipattern of passing constructor
@@ -42,13 +39,6 @@ case class Text(_data: String) extends Atom[String](_data)
if (_data == null)
throw new IllegalArgumentException("tried to construct Text with null")
- /** XXX More hashCode flailing. */
- final override def equals(x: Any) = x match {
- case s:String => s == data
- case s:Atom[_] => data == s.data
- case _ => false
- }
-
/** Returns text, with some characters escaped according to the XML
* specification.
*
diff --git a/src/library/scala/xml/TextBuffer.scala b/src/library/scala/xml/TextBuffer.scala
index 17c40aad2f..84c6c24146 100644
--- a/src/library/scala/xml/TextBuffer.scala
+++ b/src/library/scala/xml/TextBuffer.scala
@@ -8,12 +8,8 @@
// $Id$
-
package scala.xml
-import collection.Seq
-import collection.mutable.StringBuilder
-import collection.immutable.{List, Nil, ::}
import Utility.isSpace
object TextBuffer {
diff --git a/src/library/scala/xml/TopScope.scala b/src/library/scala/xml/TopScope.scala
index c638b80b2d..8b3c1383c9 100644
--- a/src/library/scala/xml/TopScope.scala
+++ b/src/library/scala/xml/TopScope.scala
@@ -10,8 +10,6 @@
package scala.xml
-import collection.mutable.StringBuilder
-
/** top level namespace scope. only contains the predefined binding
* for the &quot;xml&quot; prefix which is bound to
* &quot;http://www.w3.org/XML/1998/namespace&quot;
diff --git a/src/library/scala/xml/Unparsed.scala b/src/library/scala/xml/Unparsed.scala
index a570b83fb5..d3c63172e8 100644
--- a/src/library/scala/xml/Unparsed.scala
+++ b/src/library/scala/xml/Unparsed.scala
@@ -22,13 +22,6 @@ class Unparsed(data: String) extends Atom[String](data)
if (null == data)
throw new IllegalArgumentException("tried to construct Unparsed with null")
- /** XXX another hashCode fail */
- final override def equals(x: Any) = x match {
- case s:String => s == data
- case s:Atom[_] => data == s.data
- case _ => false
- }
-
/** returns text, with some characters escaped according to XML spec */
override def buildString(sb: StringBuilder) = sb append data
}
diff --git a/src/library/scala/xml/UnprefixedAttribute.scala b/src/library/scala/xml/UnprefixedAttribute.scala
index 283cc3a1d0..a8720f13e1 100644
--- a/src/library/scala/xml/UnprefixedAttribute.scala
+++ b/src/library/scala/xml/UnprefixedAttribute.scala
@@ -8,13 +8,8 @@
// $Id$
-
package scala.xml
-import collection.Seq
-import collection.mutable.StringBuilder
-
-
/** Unprefixed attributes have the null namespace, and no prefix field
*
* @author Burak Emir
@@ -25,6 +20,7 @@ class UnprefixedAttribute(
next1: MetaData)
extends Attribute
{
+ final val pre = null
val next = if (value ne null) next1 else next1.remove(key)
/** same as this(key, Text(value), next) */
@@ -38,9 +34,6 @@ extends Attribute
/** returns a copy of this unprefixed attribute with the given next field*/
def copy(next: MetaData) = new UnprefixedAttribute(key, value, next)
- def equals1(m: MetaData) =
- !m.isPrefixed && (m.key == key) && (m.value sameElements value)
-
final def getNamespace(owner: Node): String = null
/**
@@ -62,33 +55,6 @@ extends Attribute
*/
def apply(namespace: String, scope: NamespaceBinding, key: String): Seq[Node] =
next(namespace, scope, key)
-
- override def hashCode() =
- key.hashCode() * 7 + { if(value ne null) value.hashCode() * 53 else 0 } + next.hashCode()
-
- final def isPrefixed = false
-
- /** appends string representation of only this attribute to stringbuffer.
- *
- * @param sb ..
- */
- def toString1(sb: StringBuilder): Unit = if (value ne null) {
- sb.append(key)
- sb.append('=')
- val sb2 = new StringBuilder()
- Utility.sequenceToXML(value, TopScope, sb2, true)
- Utility.appendQuoted(sb2.toString(), sb)
- }
-
- def wellformed(scope: NamespaceBinding): Boolean =
- (null == next(null, scope, key)) && next.wellformed(scope)
-
- def remove(key: String) =
- if (this.key == key) next else copy(next.remove(key))
-
- def remove(namespace: String, scope: NamespaceBinding, key: String): MetaData =
- next.remove(namespace, scope, key)
-
}
object UnprefixedAttribute {
def unapply(x: UnprefixedAttribute) = Some(x.key, x.value, x.next)
diff --git a/src/library/scala/xml/Utility.scala b/src/library/scala/xml/Utility.scala
index 1cfe9c79c9..48a23dc389 100644
--- a/src/library/scala/xml/Utility.scala
+++ b/src/library/scala/xml/Utility.scala
@@ -11,8 +11,8 @@
package scala.xml
-import collection.mutable.{Set, HashSet, StringBuilder}
-import collection.Seq
+import collection.mutable
+import mutable.{ Set, HashSet }
import parsing.XhtmlEntities
/**
@@ -84,7 +84,7 @@ object Utility extends AnyRef with parsing.TokenTests
object Escapes {
/** For reasons unclear escape and unescape are a long ways from
- being logical inverses. */
+ * being logical inverses. */
val pairs = Map(
"lt" -> '<',
"gt" -> '>',
@@ -106,11 +106,29 @@ object Utility extends AnyRef with parsing.TokenTests
* @param s ...
* @return ...
*/
- final def escape(text: String, s: StringBuilder): StringBuilder =
- text.foldLeft(s)((s, c) => escMap.get(c) match {
- case Some(str) => s append str
- case None => s append c
- })
+ final def escape(text: String, s: StringBuilder): StringBuilder = {
+ // Implemented per XML spec:
+ // http://www.w3.org/International/questions/qa-controls
+ // imperative code 3x-4x faster than current implementation
+ // dpp (David Pollak) 2010/02/03
+ val len = text.length
+ var pos = 0
+ while (pos < len) {
+ text.charAt(pos) match {
+ case '<' => s.append("&lt;")
+ case '>' => s.append("&gt;")
+ case '&' => s.append("&amp;")
+ case '"' => s.append("&quot;")
+ case '\n' => s.append('\n')
+ case '\r' => s.append('\r')
+ case '\t' => s.append('\t')
+ case c => if (c >= ' ') s.append(c)
+ }
+
+ pos += 1
+ }
+ s
+ }
/**
* Appends unescaped string to <code>s</code>, amp becomes &amp;
@@ -131,7 +149,7 @@ object Utility extends AnyRef with parsing.TokenTests
* @param nodes ...
* @return ...
*/
- def collectNamespaces(nodes: Seq[Node]): Set[String] =
+ def collectNamespaces(nodes: Seq[Node]): mutable.Set[String] =
nodes.foldLeft(new HashSet[String]) { (set, x) => collectNamespaces(x, set) ; set }
/**
@@ -140,7 +158,7 @@ object Utility extends AnyRef with parsing.TokenTests
* @param n ...
* @param set ...
*/
- def collectNamespaces(n: Node, set: Set[String]) {
+ def collectNamespaces(n: Node, set: mutable.Set[String]) {
if (n.doCollectNamespaces) {
set += n.namespace
for (a <- n.attributes) a match {
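The new `escape` above swaps a `foldLeft` over a map for an explicit while loop, which the in-diff comment credits with a 3x-4x speedup. A standalone version of the same loop (plain `String => String` wrapper, invented here for illustration):

```scala
object Escape {
  // Escapes the five XML-significant cases; keeps \n, \r, \t;
  // silently drops other control characters below ' '.
  def escape(text: String): String = {
    val sb  = new StringBuilder
    var pos = 0
    while (pos < text.length) {
      text.charAt(pos) match {
        case '<'                => sb.append("&lt;")
        case '>'                => sb.append("&gt;")
        case '&'                => sb.append("&amp;")
        case '"'                => sb.append("&quot;")
        case '\n' | '\r' | '\t' => sb.append(text.charAt(pos))
        case c                  => if (c >= ' ') sb.append(c)
      }
      pos += 1
    }
    sb.toString
  }
}
```

Note that, as in the patch, the single-quote character is not escaped here.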
diff --git a/src/library/scala/xml/XML.scala b/src/library/scala/xml/XML.scala
index 1e85d4ae06..dd85b58e50 100644
--- a/src/library/scala/xml/XML.scala
+++ b/src/library/scala/xml/XML.scala
@@ -11,12 +11,10 @@
package scala.xml
-import scala.xml.parsing.NoBindingFactoryAdapter
-import scala.xml.factory.XMLLoader
-import org.xml.sax.InputSource
-import javax.xml.parsers.{ SAXParser, SAXParserFactory }
-import java.io.{File, FileDescriptor, FileInputStream, FileOutputStream}
-import java.io.{InputStream, Reader, StringReader, Writer}
+import parsing.NoBindingFactoryAdapter
+import factory.XMLLoader
+import java.io.{ File, FileDescriptor, FileInputStream, FileOutputStream }
+import java.io.{ InputStream, Reader, StringReader, Writer }
import java.nio.channels.Channels
import scala.util.control.Exception.ultimately
@@ -56,11 +54,11 @@ object XML extends XMLLoader[Elem]
@deprecated("Use save() instead")
final def saveFull(filename: String, node: Node, xmlDecl: Boolean, doctype: dtd.DocType): Unit =
- saveFull(filename, node, encoding, xmlDecl, doctype)
+ save(filename, node, encoding, xmlDecl, doctype)
@deprecated("Use save() instead")
final def saveFull(filename: String, node: Node, enc: String, xmlDecl: Boolean, doctype: dtd.DocType): Unit =
- saveFull(filename, node, enc, xmlDecl, doctype)
+ save(filename, node, enc, xmlDecl, doctype)
/** Saves a node to a file with given filename using given encoding
* optionally with xmldecl and doctype declaration.
@@ -82,7 +80,7 @@ object XML extends XMLLoader[Elem]
val fos = new FileOutputStream(filename)
val w = Channels.newWriter(fos.getChannel(), enc)
- ultimately({ w.close() ; fos.close() })(
+ ultimately(w.close())(
write(w, node, enc, xmlDecl, doctype)
)
}
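The `save` fix above relies on `scala.util.control.Exception.ultimately`, which runs its first argument in a `finally` block around the second; closing the `Channels` writer also closes the underlying stream, which is presumably why the separate `fos.close()` could be dropped. A small sketch of the control flow (the `Demo` object is invented):

```scala
import scala.util.control.Exception.ultimately

object Demo {
  var closed = false
  var result = 0

  // The cleanup runs whether or not the body throws.
  def run(): Unit =
    ultimately { closed = true } {
      result = 21 * 2
    }
}
```

If the body threw, `closed` would still be set to true before the exception propagated.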
diff --git a/src/library/scala/xml/dtd/ContentModel.scala b/src/library/scala/xml/dtd/ContentModel.scala
index d5d4b95cce..772d8ec599 100644
--- a/src/library/scala/xml/dtd/ContentModel.scala
+++ b/src/library/scala/xml/dtd/ContentModel.scala
@@ -13,10 +13,7 @@ package scala.xml
package dtd
import util.regexp.WordExp
-import util.automata.{DetWordAutom, SubsetConstruction, WordBerrySethi}
-import collection.mutable.{HashSet, StringBuilder}
-import collection.immutable.{List, Nil}
-import collection.Seq
+import util.automata._
import Utility.sbToString
import PartialFunction._
diff --git a/src/library/scala/xml/dtd/ContentModelParser.scala b/src/library/scala/xml/dtd/ContentModelParser.scala
index c260a9fc46..2b0df3f6a5 100644
--- a/src/library/scala/xml/dtd/ContentModelParser.scala
+++ b/src/library/scala/xml/dtd/ContentModelParser.scala
@@ -8,12 +8,9 @@
// $Id$
-
package scala.xml
package dtd
-import collection.immutable.List
-
/** Parser for regexps (content models in DTD element declarations) */
object ContentModelParser extends Scanner { // a bit too permissive concerning #PCDATA
diff --git a/src/library/scala/xml/dtd/DTD.scala b/src/library/scala/xml/dtd/DTD.scala
index 14c16f8489..0fde1188f3 100644
--- a/src/library/scala/xml/dtd/DTD.scala
+++ b/src/library/scala/xml/dtd/DTD.scala
@@ -8,31 +8,25 @@
// $Id$
-
package scala.xml
package dtd
-import scala.collection.mutable.{HashMap, Map}
+import collection.mutable
+import mutable.HashMap
/** A document type declaration.
*
* @author Burak Emir
*/
abstract class DTD {
-
- var externalID: ExternalID = null
-
- def notations: Seq[NotationDecl] = Nil
-
+ var externalID: ExternalID = null
+ var decls: List[Decl] = Nil
+ def notations: Seq[NotationDecl] = Nil
def unparsedEntities: Seq[EntityDecl] = Nil
- var elem: Map[String, ElemDecl] = new HashMap[String, ElemDecl]()
-
- var attr: Map[String, AttListDecl] = new HashMap[String, AttListDecl]()
-
- var ent: Map[String, EntityDecl] = new HashMap[String, EntityDecl]()
-
- var decls: List[Decl] = Nil
+ var elem: mutable.Map[String, ElemDecl] = new HashMap[String, ElemDecl]()
+ var attr: mutable.Map[String, AttListDecl] = new HashMap[String, AttListDecl]()
+ var ent: mutable.Map[String, EntityDecl] = new HashMap[String, EntityDecl]()
override def toString() =
"DTD [\n%s%s]".format(
diff --git a/src/library/scala/xml/dtd/Decl.scala b/src/library/scala/xml/dtd/Decl.scala
index 25ee30b356..2ac3d42a67 100644
--- a/src/library/scala/xml/dtd/Decl.scala
+++ b/src/library/scala/xml/dtd/Decl.scala
@@ -8,14 +8,10 @@
// $Id$
-
package scala.xml
package dtd
import Utility.sbToString
-import collection.immutable.List
-import collection.mutable.StringBuilder
-
abstract class Decl
@@ -114,7 +110,7 @@ case class IntDef(value:String) extends EntityDef {
val n = tmp.substring(ix, iz);
if( !Utility.isName( n ))
- throw new IllegalArgumentException("interal entity def: \""+n+"\" must be an XML Name");
+ throw new IllegalArgumentException("internal entity def: \""+n+"\" must be an XML Name");
tmp = tmp.substring(iz+1, tmp.length());
ix = tmp.indexOf('%');
diff --git a/src/library/scala/xml/dtd/DocType.scala b/src/library/scala/xml/dtd/DocType.scala
index dab1d9ff6b..7da38b3e73 100644
--- a/src/library/scala/xml/dtd/DocType.scala
+++ b/src/library/scala/xml/dtd/DocType.scala
@@ -8,12 +8,9 @@
// $Id$
-
package scala.xml
package dtd
-import collection.Seq
-
/** An XML node for document type declaration.
*
* @author Burak Emir
diff --git a/src/library/scala/xml/dtd/ElementValidator.scala b/src/library/scala/xml/dtd/ElementValidator.scala
index cc37e2b527..9ebed8d87c 100644
--- a/src/library/scala/xml/dtd/ElementValidator.scala
+++ b/src/library/scala/xml/dtd/ElementValidator.scala
@@ -95,7 +95,7 @@ class ElementValidator() extends Function1[Node,Boolean] {
}
/** check children, return true if conform to content model
- * @pre contentModel != null
+ * @note contentModel != null
*/
def check(nodes: Seq[Node]): Boolean = contentModel match {
case ANY => true
@@ -120,7 +120,7 @@ class ElementValidator() extends Function1[Node,Boolean] {
}
/** applies various validations - accumulates error messages in exc
- * @todo: fail on first error, ignore other errors (rearranging conditions)
+ * @todo fail on first error, ignore other errors (rearranging conditions)
*/
def apply(n: Node): Boolean =
//- ? check children
diff --git a/src/library/scala/xml/dtd/ExternalID.scala b/src/library/scala/xml/dtd/ExternalID.scala
index 784273083a..b0d311e54a 100644
--- a/src/library/scala/xml/dtd/ExternalID.scala
+++ b/src/library/scala/xml/dtd/ExternalID.scala
@@ -8,14 +8,9 @@
// $Id$
-
package scala.xml
package dtd
-import collection.immutable.{List, Nil}
-import collection.mutable.StringBuilder
-
-
/** an ExternalIDs - either PublicID or SystemID
*
* @author Burak Emir
diff --git a/src/library/scala/xml/dtd/Scanner.scala b/src/library/scala/xml/dtd/Scanner.scala
index 1a45a186a0..7b3e2acfe0 100644
--- a/src/library/scala/xml/dtd/Scanner.scala
+++ b/src/library/scala/xml/dtd/Scanner.scala
@@ -8,13 +8,9 @@
// $Id$
-
package scala.xml
package dtd
-import collection.{Seq, Iterator}
-import collection.immutable.{List, Nil}
-
/** Scanner for regexps (content models in DTD element declarations)
* todo: cleanup
*/
diff --git a/src/library/scala/xml/factory/Binder.scala b/src/library/scala/xml/factory/Binder.scala
index caad009b9b..3996ef2d36 100644
--- a/src/library/scala/xml/factory/Binder.scala
+++ b/src/library/scala/xml/factory/Binder.scala
@@ -12,7 +12,7 @@
package scala.xml
package factory
-import scala.xml.parsing.ValidatingMarkupHandler
+import parsing.ValidatingMarkupHandler
/**
* @author Burak Emir
diff --git a/src/library/scala/xml/factory/NodeFactory.scala b/src/library/scala/xml/factory/NodeFactory.scala
index 45dac6ccda..2dd52242db 100644
--- a/src/library/scala/xml/factory/NodeFactory.scala
+++ b/src/library/scala/xml/factory/NodeFactory.scala
@@ -12,11 +12,7 @@ package scala.xml
package factory
import parsing.{ FactoryAdapter, NoBindingFactoryAdapter }
-import collection.Seq
-import collection.immutable.{List, Nil}
-import org.xml.sax.InputSource
import java.io.{ InputStream, Reader, StringReader, File, FileDescriptor, FileInputStream }
-import javax.xml.parsers.{ SAXParser, SAXParserFactory }
trait NodeFactory[A <: Node]
{
diff --git a/src/library/scala/xml/factory/XMLLoader.scala b/src/library/scala/xml/factory/XMLLoader.scala
index a1bca21b40..8bb0cf4188 100644
--- a/src/library/scala/xml/factory/XMLLoader.scala
+++ b/src/library/scala/xml/factory/XMLLoader.scala
@@ -11,11 +11,9 @@
package scala.xml
package factory
+import javax.xml.parsers.SAXParserFactory
import parsing.{ FactoryAdapter, NoBindingFactoryAdapter }
-import org.xml.sax.InputSource
import java.io.{ InputStream, Reader, StringReader, File, FileDescriptor, FileInputStream }
-import javax.xml.parsers.{ SAXParser, SAXParserFactory }
-import java.net.URL
/** Presents collection of XML loading methods which use the parser
* created by "def parser".
diff --git a/src/library/scala/xml/include/XIncludeException.scala b/src/library/scala/xml/include/XIncludeException.scala
index 26c66f9b1d..a671f32dca 100644
--- a/src/library/scala/xml/include/XIncludeException.scala
+++ b/src/library/scala/xml/include/XIncludeException.scala
@@ -43,7 +43,7 @@ class XIncludeException(message: String) extends Exception(message) {
* This method allows you to store the original exception.
*
* @param nestedException the underlying exception which
- caused the XIncludeException to be thrown
+ * caused the XIncludeException to be thrown
*/
def setRootCause(nestedException: Throwable ) {
this.rootCause = nestedException
diff --git a/src/library/scala/xml/include/sax/Main.scala b/src/library/scala/xml/include/sax/Main.scala
index e7e986e0f8..60031b4b6a 100644
--- a/src/library/scala/xml/include/sax/Main.scala
+++ b/src/library/scala/xml/include/sax/Main.scala
@@ -13,11 +13,10 @@ package include.sax
import scala.xml.include._
import scala.util.control.Exception.{ catching, ignoring }
-import org.xml.sax.{ SAXException, SAXParseException, EntityResolver, XMLReader }
+import org.xml.sax.XMLReader
import org.xml.sax.helpers.XMLReaderFactory
object Main {
- private val xercesClass = "org.apache.xerces.parsers.SAXParser"
private val namespacePrefixes = "http://xml.org/sax/features/namespace-prefixes"
private val lexicalHandler = "http://xml.org/sax/properties/lexical-handler"
@@ -27,7 +26,7 @@ object Main {
* </p>
*
* @param args contains the URLs and/or filenames
- * of the documents to be procesed.
+ * of the documents to be processed.
*/
def main(args: Array[String]) {
def saxe[T](body: => T) = catching[T](classOf[SAXException]) opt body
@@ -35,7 +34,7 @@ object Main {
val parser: XMLReader =
saxe[XMLReader](XMLReaderFactory.createXMLReader()) getOrElse (
- saxe[XMLReader](XMLReaderFactory.createXMLReader(xercesClass)) getOrElse (
+ saxe[XMLReader](XMLReaderFactory.createXMLReader(XercesClassName)) getOrElse (
return error("Could not find an XML parser")
)
)
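The `saxe` helper above relies on `scala.util.control.Exception.catching`, which turns a throwing expression into an `Option`. A minimal standalone sketch of the same pattern (the `parseIntOpt` name and the number-parsing example are illustrative, not from the patch):

```scala
import scala.util.control.Exception.catching

object CatchingDemo {
  // Wrap a call that may throw, yielding Option instead of propagating,
  // analogous to saxe[XMLReader](...) in Main.scala above.
  def parseIntOpt(s: String): Option[Int] =
    catching(classOf[NumberFormatException]) opt s.toInt

  def main(args: Array[String]): Unit = {
    // First successful alternative wins, mirroring the getOrElse chain
    // used to fall back from the default reader to the Xerces parser.
    val n = parseIntOpt("not a number") getOrElse (parseIntOpt("42") getOrElse 0)
    println(n)  // 42
  }
}
```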
diff --git a/src/library/scala/xml/include/sax/XIncludeFilter.scala b/src/library/scala/xml/include/sax/XIncludeFilter.scala
index b469086e73..6e64fa9aa5 100644
--- a/src/library/scala/xml/include/sax/XIncludeFilter.scala
+++ b/src/library/scala/xml/include/sax/XIncludeFilter.scala
@@ -12,11 +12,10 @@ package scala.xml
package include.sax
import scala.xml.include._
-import org.xml.sax.{ Attributes, SAXException, XMLReader, EntityResolver, Locator }
+import org.xml.sax.{ Attributes, XMLReader, Locator }
import org.xml.sax.helpers.{ XMLReaderFactory, XMLFilterImpl, NamespaceSupport, AttributesImpl }
-import java.net.{ URL, URLConnection, MalformedURLException }
-import java.io.{ UnsupportedEncodingException, IOException, InputStream, BufferedInputStream, InputStreamReader }
+import java.io.{ InputStream, BufferedInputStream, InputStreamReader }
import java.util.Stack
/**
@@ -351,61 +350,49 @@ class XIncludeFilter extends XMLFilterImpl {
be downloaded from the specified URL.
*/
private def includeXMLDocument(url: String) {
- var source: URL = null
- try {
- val base = bases.peek().asInstanceOf[URL]
- source = new URL(base, url)
- }
- catch {
- case e:MalformedURLException =>
- val ex = new UnavailableResourceException("Unresolvable URL " + url
- + getLocation());
- ex.setRootCause(e)
- throw new SAXException("Unresolvable URL " + url + getLocation(), ex)
- }
+ val source =
+ try new URL(bases.peek(), url)
+ catch {
+ case e: MalformedURLException =>
+ val ex = new UnavailableResourceException("Unresolvable URL " + url + getLocation())
+ ex setRootCause e
+ throw new SAXException("Unresolvable URL " + url + getLocation(), ex)
+ }
try {
- // make this more robust
- var parser: XMLReader = null
- try {
- parser = XMLReaderFactory.createXMLReader()
- } catch {
- case e:SAXException =>
- try {
- parser = XMLReaderFactory.createXMLReader(
- "org.apache.xerces.parsers.SAXParser"
- );
- } catch {
- case e2: SAXException =>
- System.err.println("Could not find an XML parser")
- }
- }
- if(parser != null) {
- parser.setContentHandler(this)
- val resolver = this.getEntityResolver()
- if (resolver != null) parser.setEntityResolver(resolver);
- // save old level and base
- val previousLevel = level
- this.level = 0
- if (bases.contains(source)) {
- val e = new CircularIncludeException(
- "Circular XInclude Reference to " + source + getLocation()
- );
- throw new SAXException("Circular XInclude Reference", e)
+ val parser: XMLReader =
+ try XMLReaderFactory.createXMLReader()
+ catch {
+ case e: SAXException =>
+ try XMLReaderFactory.createXMLReader(XercesClassName)
+ catch { case _: SAXException => return System.err.println("Could not find an XML parser") }
}
- bases.push(source)
- atRoot = true
- parser.parse(source.toExternalForm())
- // restore old level and base
- this.level = previousLevel
- bases.pop()
- }
+
+ parser setContentHandler this
+ val resolver = this.getEntityResolver()
+ if (resolver != null)
+ parser setEntityResolver resolver
+
+ // save old level and base
+ val previousLevel = level
+ this.level = 0
+ if (bases contains source)
+ throw new SAXException(
+ "Circular XInclude Reference",
+ new CircularIncludeException("Circular XInclude Reference to " + source + getLocation())
+ )
+
+ bases push source
+ atRoot = true
+ parser parse source.toExternalForm()
+
+ // restore old level and base
+ this.level = previousLevel
+ bases.pop()
}
catch {
- case e:IOException =>
- throw new SAXException("Document not found: "
- + source.toExternalForm() + getLocation(), e)
+ case e: IOException =>
+ throw new SAXException("Document not found: " + source.toExternalForm() + getLocation(), e)
}
-
}
}
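The `includeXMLDocument` rewrite above replaces a `var source: URL = null` assigned inside `try` with `try`/`catch` used directly as an expression bound to a `val`. A minimal sketch of that idiom (the `resolve` helper is hypothetical, for illustration only):

```scala
import java.net.{ URL, MalformedURLException }

object TryExpr {
  // try/catch is an expression in Scala: bind its result to a val
  // instead of pre-declaring a nullable var and mutating it.
  def resolve(base: URL, spec: String): URL =
    try new URL(base, spec)
    catch {
      case e: MalformedURLException =>
        throw new IllegalArgumentException("Unresolvable URL " + spec, e)
    }

  def main(args: Array[String]): Unit = {
    val u = resolve(new URL("http://example.com/a/"), "b.xml")
    println(u)  // http://example.com/a/b.xml
  }
}
```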
diff --git a/src/library/scala/xml/include/sax/XIncluder.scala b/src/library/scala/xml/include/sax/XIncluder.scala
index 3417dd78f0..bd9da10c59 100644
--- a/src/library/scala/xml/include/sax/XIncluder.scala
+++ b/src/library/scala/xml/include/sax/XIncluder.scala
@@ -10,22 +10,13 @@
package scala.xml
package include.sax
+
import scala.xml.include._
+import collection.mutable.Stack
-import org.xml.sax.SAXException
-import org.xml.sax.SAXParseException
-import org.xml.sax.ContentHandler
-import org.xml.sax.EntityResolver
-import org.xml.sax.helpers.XMLReaderFactory
-import org.xml.sax.XMLReader
-import org.xml.sax.Locator
-import org.xml.sax.Attributes
+import org.xml.sax.{ ContentHandler, XMLReader, Locator, Attributes }
import org.xml.sax.ext.LexicalHandler
-
-import java.io.{File, IOException, OutputStream, OutputStreamWriter,
- UnsupportedEncodingException, Writer}
-import java.net.{MalformedURLException, URL}
-import java.util.Stack
+import java.io.{ File, OutputStream, OutputStreamWriter, Writer }
/** XIncluder is a SAX <code>ContentHandler</code>
* that writes its XML document onto an output stream after resolving
@@ -35,8 +26,7 @@ import java.util.Stack
* based on Eliotte Rusty Harold's SAXXIncluder
* </p>
*/
-class XIncluder(outs:OutputStream, encoding:String) extends Object
-with ContentHandler with LexicalHandler {
+class XIncluder(outs: OutputStream, encoding: String) extends ContentHandler with LexicalHandler {
var out = new OutputStreamWriter(outs, encoding)
@@ -153,7 +143,7 @@ with ContentHandler with LexicalHandler {
def startDTD(name: String, publicID: String, systemID: String) {
inDTD = true
// if this is the source document, output a DOCTYPE declaration
- if (entities.size() == 0) {
+ if (entities.isEmpty) {
var id = ""
if (publicID != null) id = " PUBLIC \"" + publicID + "\" \"" + systemID + '"';
else if (systemID != null) id = " SYSTEM \"" + systemID + '"';
@@ -169,7 +159,7 @@ with ContentHandler with LexicalHandler {
def endDTD() {}
def startEntity(name: String) {
- entities.push(name)
+ entities push name
}
def endEntity(name: String) {
diff --git a/src/library/scala/xml/package.scala b/src/library/scala/xml/package.scala
new file mode 100644
index 0000000000..33639ed978
--- /dev/null
+++ b/src/library/scala/xml/package.scala
@@ -0,0 +1,18 @@
+package scala
+
+package object xml {
+ val XercesClassName = "org.apache.xerces.parsers.SAXParser"
+
+ type SAXException = org.xml.sax.SAXException
+ type SAXParseException = org.xml.sax.SAXParseException
+ type EntityResolver = org.xml.sax.EntityResolver
+ type InputSource = org.xml.sax.InputSource
+
+ type SAXParser = javax.xml.parsers.SAXParser
+
+ type IOException = java.io.IOException
+ type UnsupportedEncodingException = java.io.UnsupportedEncodingException
+
+ type URL = java.net.URL
+ type MalformedURLException = java.net.MalformedURLException
+} \ No newline at end of file
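The new `scala/xml/package.scala` uses a package object to centralize type aliases (and the `XercesClassName` constant) so the per-file SAX and java.io imports deleted throughout this patch become unnecessary inside `scala.xml`. A small self-contained sketch of the technique, with a hypothetical `netio` package standing in for `scala.xml`:

```scala
// Hypothetical package object: aliases let code inside the package
// refer to external types without importing them in every file.
package object netio {
  type IOException = java.io.IOException
  type URL = java.net.URL
  val DefaultEncoding = "UTF-8"
}

package netio {
  object Demo {
    // URL and DefaultEncoding resolve via the enclosing package object.
    def describe(u: URL): String = u.getHost + " (" + DefaultEncoding + ")"
  }
}

object PackageObjectMain {
  def main(args: Array[String]): Unit =
    println(netio.Demo.describe(new java.net.URL("http://example.com/x")))
}
```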
diff --git a/src/library/scala/xml/parsing/ConstructingParser.scala b/src/library/scala/xml/parsing/ConstructingParser.scala
index f029fc745a..00f195e9fd 100644
--- a/src/library/scala/xml/parsing/ConstructingParser.scala
+++ b/src/library/scala/xml/parsing/ConstructingParser.scala
@@ -25,28 +25,27 @@ object ConstructingParser {
}
/** An xml parser. parses XML and invokes callback methods of a MarkupHandler.
- * Don't forget to call next.ch on a freshly instantiated parser in order to
- * initialize it. If you get the parser from the object method, initialization
- * is already done for you.
- *
- *<pre>
-object parseFromURL {
- def main(args:Array[String]): Unit = {
- val url = args(0);
- val src = scala.io.Source.fromURL(url);
- val cpa = scala.xml.parsing.ConstructingParser.fromSource(src, false); // fromSource initializes automatically
- val doc = cpa.document();
-
- // let's see what it is
- val ppr = new scala.xml.PrettyPrinter(80,5);
- val ele = doc.docElem;
- Console.println("finished parsing");
- val out = ppr.format(ele);
- Console.println(out);
- }
-}
-</pre>
- */
+ * Don't forget to call next.ch on a freshly instantiated parser in order to
+ * initialize it. If you get the parser from the object method, initialization
+ * is already done for you.
+ *
+ * {{{
+ * object parseFromURL {
+ * def main(args:Array[String]): Unit = {
+ * val url = args(0);
+ * val src = scala.io.Source.fromURL(url);
+ * val cpa = scala.xml.parsing.ConstructingParser.fromSource(src, false); // fromSource initializes automatically
+ * val doc = cpa.document();
+ *
+ * // let's see what it is
+ * val ppr = new scala.xml.PrettyPrinter(80,5);
+ * val ele = doc.docElem;
+ * Console.println("finished parsing");
+ * val out = ppr.format(ele);
+ * Console.println(out);
+ * }
+ * }
+ * }}} */
class ConstructingParser(val input: Source, val preserveWS: Boolean)
extends ConstructingHandler
with ExternalSources
diff --git a/src/library/scala/xml/parsing/DefaultMarkupHandler.scala b/src/library/scala/xml/parsing/DefaultMarkupHandler.scala
index 69c59c30cf..0a8bd7c4d6 100644
--- a/src/library/scala/xml/parsing/DefaultMarkupHandler.scala
+++ b/src/library/scala/xml/parsing/DefaultMarkupHandler.scala
@@ -13,7 +13,7 @@ package scala.xml
package parsing
-/** default implemenation of markup handler always returns NodeSeq.Empty */
+/** default implementation of markup handler always returns NodeSeq.Empty */
abstract class DefaultMarkupHandler extends MarkupHandler {
def elem(pos: Int, pre: String, label: String, attrs: MetaData,
diff --git a/src/library/scala/xml/parsing/FactoryAdapter.scala b/src/library/scala/xml/parsing/FactoryAdapter.scala
index a83f9677a1..6960e05d25 100644
--- a/src/library/scala/xml/parsing/FactoryAdapter.scala
+++ b/src/library/scala/xml/parsing/FactoryAdapter.scala
@@ -12,20 +12,15 @@
package scala.xml
package parsing
-import java.io.{InputStream, Reader, File, FileDescriptor, FileInputStream}
-import collection.mutable.{Stack, StringBuilder}
-import collection.immutable.{List, Nil}
-import collection.{Seq, Iterator}
+import java.io.{ InputStream, Reader, File, FileDescriptor, FileInputStream }
+import collection.mutable.Stack
-import org.xml.sax.{ Attributes, InputSource }
+import org.xml.sax.Attributes
import org.xml.sax.helpers.DefaultHandler
-import javax.xml.parsers.{ SAXParser, SAXParserFactory }
// can be mixed into FactoryAdapter if desired
trait ConsoleErrorHandler extends DefaultHandler
{
- import org.xml.sax.SAXParseException
-
// ignore warning, crimson warns even for entity resolution!
override def warning(ex: SAXParseException): Unit = { }
override def error(ex: SAXParseException): Unit = printError("Error", ex)
diff --git a/src/library/scala/xml/parsing/FatalError.scala b/src/library/scala/xml/parsing/FatalError.scala
index 01b68f6591..73634298fa 100644
--- a/src/library/scala/xml/parsing/FatalError.scala
+++ b/src/library/scala/xml/parsing/FatalError.scala
@@ -10,7 +10,8 @@
package scala.xml
-package parsing;
+package parsing
-
-case class FatalError(msg:String) extends java.lang.RuntimeException(msg);
+/** !!! This is poorly named, but I guess it's in the API.
+ */
+case class FatalError(msg: String) extends java.lang.RuntimeException(msg)
diff --git a/src/library/scala/xml/parsing/MarkupHandler.scala b/src/library/scala/xml/parsing/MarkupHandler.scala
index a0058e8bc4..bcb0e03a07 100644
--- a/src/library/scala/xml/parsing/MarkupHandler.scala
+++ b/src/library/scala/xml/parsing/MarkupHandler.scala
@@ -12,7 +12,8 @@
package scala.xml
package parsing
-import scala.collection.mutable.{HashMap, Map}
+import collection.mutable
+import mutable.HashMap
import scala.io.Source
import scala.util.logging.Logged
import scala.xml.dtd._
@@ -32,7 +33,7 @@ abstract class MarkupHandler extends Logged
val isValidating: Boolean = false
var decls: List[Decl] = Nil
- var ent: Map[String, EntityDecl] = new HashMap[String, EntityDecl]()
+ var ent: mutable.Map[String, EntityDecl] = new HashMap[String, EntityDecl]()
def lookupElemDecl(Label: String): ElemDecl = {
for (z @ ElemDecl(Label, _) <- decls)
@@ -69,7 +70,7 @@ abstract class MarkupHandler extends Logged
*/
def elemEnd(pos: Int, pre: String, label: String): Unit = ()
- /** callback method invoked by MarkupParser after parsing an elementm,
+ /** callback method invoked by MarkupParser after parsing an element,
* between the elemStart and elemEnd callbacks
*
* @param pos the position in the source file
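The `MarkupHandler` change above retypes the `ent` field from the (then-ambiguous) imported `Map` to an explicit `mutable.Map`, while keeping `HashMap` as the implementation. A minimal sketch of declaring a field by its interface type (field names and values here are illustrative):

```scala
import scala.collection.mutable

object MapTyping {
  // Declare the field with the abstract mutable.Map type so the concrete
  // implementation can change without touching client code.
  var ent: mutable.Map[String, Int] = new mutable.HashMap[String, Int]()

  def main(args: Array[String]): Unit = {
    ent("amp") = 38
    ent("lt") = 60
    println(ent.getOrElse("lt", -1))  // 60
  }
}
```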
diff --git a/src/library/scala/xml/parsing/MarkupParser.scala b/src/library/scala/xml/parsing/MarkupParser.scala
index a15cd0f7e4..24e0d78c6f 100644
--- a/src/library/scala/xml/parsing/MarkupParser.scala
+++ b/src/library/scala/xml/parsing/MarkupParser.scala
@@ -32,7 +32,13 @@ trait MarkupParser extends MarkupParserCommon with TokenTests
self: MarkupParser with MarkupHandler =>
type PositionType = Int
- type InputType = Source
+ type InputType = Source
+ type ElementType = NodeSeq
+ type AttributesType = (MetaData, NamespaceBinding)
+ type NamespaceType = NamespaceBinding
+
+ def truncatedError(msg: String): Nothing = throw FatalError(msg)
+ def errorNoEnd(tag: String) = throw FatalError("expected closing tag of " + tag)
def xHandleError(that: Char, msg: String) = reportSyntaxError(msg)
@@ -102,30 +108,28 @@ trait MarkupParser extends MarkupParserCommon with TokenTests
md
}
- /** &lt;? prolog ::= xml S?
- * // this is a bit more lenient than necessary...
+ /** Factored out common code.
*/
- def prolog(): Tuple3[Option[String], Option[String], Option[Boolean]] = {
-
- //Console.println("(DEBUG) prolog")
- var n = 0
+ private def prologOrTextDecl(isProlog: Boolean): (Option[String], Option[String], Option[Boolean]) = {
var info_ver: Option[String] = None
var info_enc: Option[String] = None
var info_stdl: Option[Boolean] = None
var m = xmlProcInstr()
+ var n = 0
- xSpaceOpt
+ if (isProlog)
+ xSpaceOpt
m("version") match {
- case null => ;
+ case null => ;
case Text("1.0") => info_ver = Some("1.0"); n += 1
case _ => reportSyntaxError("cannot deal with versions != 1.0")
}
m("encoding") match {
case null => ;
- case Text(enc) =>
+ case Text(enc) =>
if (!isValidIANAEncoding(enc))
reportSyntaxError("\"" + enc + "\" is not a valid encoding")
else {
@@ -133,52 +137,33 @@ trait MarkupParser extends MarkupParserCommon with TokenTests
n += 1
}
}
- m("standalone") match {
- case null => ;
- case Text("yes") => info_stdl = Some(true); n += 1
- case Text("no") => info_stdl = Some(false); n += 1
- case _ => reportSyntaxError("either 'yes' or 'no' expected")
+
+ if (isProlog) {
+ m("standalone") match {
+ case null => ;
+ case Text("yes") => info_stdl = Some(true); n += 1
+ case Text("no") => info_stdl = Some(false); n += 1
+ case _ => reportSyntaxError("either 'yes' or 'no' expected")
+ }
}
if (m.length - n != 0) {
- reportSyntaxError("VersionInfo EncodingDecl? SDDecl? or '?>' expected!");
+ val s = if (isProlog) "SDDecl? " else ""
+ reportSyntaxError("VersionInfo EncodingDecl? %sor '?>' expected!" format s)
}
- //Console.println("[MarkupParser::prolog] finished parsing prolog!");
- Tuple3(info_ver,info_enc,info_stdl)
- }
- /** prolog, but without standalone */
- def textDecl(): Tuple2[Option[String],Option[String]] = {
-
- var info_ver: Option[String] = None
- var info_enc: Option[String] = None
-
- var m = xmlProcInstr()
- var n = 0
-
- m("version") match {
- case null => ;
- case Text("1.0") => info_ver = Some("1.0"); n += 1
- case _ => reportSyntaxError("cannot deal with versions != 1.0")
- }
+ (info_ver, info_enc, info_stdl)
+ }
- m("encoding") match {
- case null => ;
- case Text(enc) =>
- if (!isValidIANAEncoding(enc))
- reportSyntaxError("\"" + enc + "\" is not a valid encoding")
- else {
- info_enc = Some(enc)
- n += 1
- }
- }
+ /** &lt;? prolog ::= xml S?
+ * // this is a bit more lenient than necessary...
+ */
+ def prolog(): (Option[String], Option[String], Option[Boolean]) =
+ prologOrTextDecl(true)
- if (m.length - n != 0) {
- reportSyntaxError("VersionInfo EncodingDecl? or '?>' expected!");
- }
- //Console.println("[MarkupParser::textDecl] finished parsing textdecl");
- Tuple2(info_ver, info_enc);
- }
+ /** prolog, but without standalone */
+ def textDecl(): (Option[String], Option[String]) =
+ prologOrTextDecl(false) match { case (x1, x2, _) => (x1, x2) }
/**
*[22] prolog ::= XMLDecl? Misc* (doctypedecl Misc*)?
@@ -190,8 +175,6 @@ trait MarkupParser extends MarkupParserCommon with TokenTests
*/
def document(): Document = {
-
- //Console.println("(DEBUG) document")
doc = new Document()
this.dtd = null
@@ -204,7 +187,6 @@ trait MarkupParser extends MarkupParserCommon with TokenTests
nextch // is prolog ?
var children: NodeSeq = null
if ('?' == ch) {
- //Console.println("[MarkupParser::document] starts with xml declaration");
nextch;
info_prolog = prolog()
doc.version = info_prolog._1
@@ -212,10 +194,8 @@ trait MarkupParser extends MarkupParserCommon with TokenTests
doc.standAlone = info_prolog._3
children = content(TopScope) // DTD handled as side effect
- } else {
- //Console.println("[MarkupParser::document] does not start with xml declaration");
- //
-
+ }
+ else {
val ts = new NodeBuffer();
content1(TopScope, ts); // DTD handled as side effect
ts &+ content(TopScope);
@@ -228,7 +208,7 @@ trait MarkupParser extends MarkupParserCommon with TokenTests
case _:ProcInstr => ;
case _:Comment => ;
case _:EntityRef => // todo: fix entities, shouldn't be "special"
- reportSyntaxError("no entity references alllowed here");
+ reportSyntaxError("no entity references allowed here");
case s:SpecialNode =>
if (s.toString().trim().length > 0) //non-empty text nodes not allowed
elemCount = elemCount + 2;
@@ -257,6 +237,14 @@ trait MarkupParser extends MarkupParserCommon with TokenTests
this
}
+ def ch_returning_nextch = { val res = ch ; nextch ; res }
+ def mkProcInstr(position: Int, name: String, text: String): NodeSeq =
+ handle.procInstr(position, name, text)
+
+ def mkAttributes(name: String, pscope: NamespaceBinding) =
+ if (isNameStart (ch)) xAttributes(pscope)
+ else (Null, pscope)
+
/** this method assign the next character to ch and advances in input */
def nextch = {
if (curInput.hasNext) {
@@ -315,27 +303,6 @@ trait MarkupParser extends MarkupParserCommon with TokenTests
(aMap,scope)
}
- /** attribute value, terminated by either ' or ". value may not contain &lt;.
- * AttValue ::= `'` { _ } `'`
- * | `"` { _ } `"`
- */
- def xAttributeValue(): String = {
- val endch = ch
- nextch
- while (ch != endch) {
- if ('<' == ch)
- reportSyntaxError( "'<' not allowed in attrib value" );
- putChar(ch)
- nextch
- }
- nextch
- val str = cbuf.toString()
- cbuf.length = 0
-
- // well-formedness constraint
- normalizeAttributeValue(str)
- }
-
/** entity value, terminated by either ' or ". value may not contain &lt;.
* AttValue ::= `'` { _ } `'`
* | `"` { _ } `"`
@@ -353,35 +320,6 @@ trait MarkupParser extends MarkupParserCommon with TokenTests
str
}
-
- /** parse a start or empty tag.
- * [40] STag ::= '&lt;' Name { S Attribute } [S]
- * [44] EmptyElemTag ::= '&lt;' Name { S Attribute } [S]
- */
- protected def xTag(pscope:NamespaceBinding): (String, MetaData, NamespaceBinding) = {
- val qname = xName
-
- xSpaceOpt
- val (aMap: MetaData, scope: NamespaceBinding) = {
- if (isNameStart(ch))
- xAttributes(pscope)
- else
- (Null, pscope)
- }
- (qname, aMap, scope)
- }
-
- /** [42] '&lt;' xmlEndTag ::= '&lt;' '/' Name S? '&gt;'
- */
- def xEndTag(n: String) = {
- xToken('/')
- val m = xName
- if (n != m)
- reportSyntaxError("expected closing tag of " + n/* +", not "+m*/);
- xSpaceOpt
- xToken('>')
- }
-
/** '&lt;! CharData ::= [CDATA[ ( {char} - {char}"]]&gt;"{char} ) ']]&gt;'
*
* see [15]
@@ -392,14 +330,6 @@ trait MarkupParser extends MarkupParserCommon with TokenTests
xTakeUntil(mkResult, () => pos, "]]>")
}
- /** CharRef ::= "&amp;#" '0'..'9' {'0'..'9'} ";"
- * | "&amp;#x" '0'..'9'|'A'..'F'|'a'..'f' { hexdigit } ";"
- *
- * see [66]
- */
- def xCharRef(ch: () => Char, nextch: () => Unit): String =
- Utility.parseCharRef(ch, nextch, reportSyntaxError _)
-
/** Comment ::= '&lt;!--' ((Char - '-') | ('-' (Char - '-')))* '--&gt;'
*
* see [15]
@@ -576,7 +506,7 @@ trait MarkupParser extends MarkupParserCommon with TokenTests
*/
def element1(pscope: NamespaceBinding): NodeSeq = {
val pos = this.pos
- val (qname, aMap, scope) = xTag(pscope)
+ val (qname, (aMap, scope)) = xTag(pscope)
val (pre, local) = Utility.prefix(qname) match {
case Some(p) => (p, qname drop p.length+1)
case _ => (null, qname)
@@ -600,50 +530,6 @@ trait MarkupParser extends MarkupParserCommon with TokenTests
res
}
- //def xEmbeddedExpr: MarkupType;
-
- /** Name ::= (Letter | '_' | ':') (NameChar)*
- *
- * see [5] of XML 1.0 specification
- */
- def xName: String = {
- if (isNameStart(ch)) {
- while (isNameChar(ch)) {
- putChar(ch)
- nextch
- }
- val n = cbuf.toString().intern()
- cbuf.length = 0
- n
- } else {
- reportSyntaxError("name expected")
- ""
- }
- }
-
- /** '&lt;?' ProcInstr ::= Name [S ({Char} - ({Char}'&gt;?' {Char})]'?&gt;'
- *
- * see [15]
- */
- def xProcInstr: NodeSeq = {
- val sb:StringBuilder = new StringBuilder()
- val n = xName
- if (isSpace(ch)) {
- xSpace
- while (true) {
- if (ch == '?' && { sb.append( ch ); nextch; ch == '>' }) {
- sb.length = sb.length - 1;
- nextch;
- return handle.procInstr(tmppos, n, sb.toString);
- } else
- sb.append(ch);
- nextch
- }
- };
- xToken("?>")
- handle.procInstr(tmppos, n, sb.toString)
- }
-
/** parse character data.
* precondition: xEmbeddedBlock == false (we are not in a scala block)
*/
@@ -815,8 +701,7 @@ trait MarkupParser extends MarkupParserCommon with TokenTests
nextch
}
- /** "rec-xml/#ExtSubset" pe references may not occur within markup
- declarations
+ /** "rec-xml/#ExtSubset" pe references may not occur within markup declarations
*/
def intSubset() {
//Console.println("(DEBUG) intSubset()")
@@ -996,50 +881,4 @@ trait MarkupParser extends MarkupParserCommon with TokenTests
pos = curInput.pos
eof = false // must be false, because of places where entity refs occur
}
-
- /** for the moment, replace only character references
- * see spec 3.3.3
- * precond: cbuf empty
- */
- def normalizeAttributeValue(attval: String): String = {
- val s: Seq[Char] = attval
- val it = s.iterator
- while (it.hasNext) {
- it.next match {
- case ' '|'\t'|'\n'|'\r' =>
- cbuf.append(' ');
- case '&' => it.next match {
- case '#' =>
- var c = it.next
- val s = xCharRef ({ () => c }, { () => c = it.next })
- cbuf.append(s)
- case nchar =>
- val nbuf = new StringBuilder()
- var d = nchar
- do {
- nbuf.append(d)
- d = it.next
- } while(d != ';');
- nbuf.toString() match {
- case "lt" => cbuf.append('<')
- case "gt" => cbuf.append('>')
- case "amp" => cbuf.append('&')
- case "apos" => cbuf.append('\'')
- case "quot" => cbuf.append('"')
- case "quote" => cbuf.append('"')
- case name =>
- cbuf.append('&')
- cbuf.append(name)
- cbuf.append(';')
- }
- }
- case c =>
- cbuf.append(c)
- }
- }
- val name = cbuf.toString()
- cbuf.length = 0
- name
- }
-
}
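The `MarkupParser` hunk above collapses the near-duplicate `prolog()` and `textDecl()` into one `prologOrTextDecl(isProlog)` helper, with `textDecl` simply dropping the standalone field. A simplified sketch of that consolidation, using a plain attribute map in place of the real `xmlProcInstr()` machinery (names and parsing logic here are illustrative, not the library's):

```scala
object Consolidate {
  // Two near-identical routines merged into one parameterized helper,
  // mirroring how prolog() and textDecl() now share prologOrTextDecl.
  private def parseDecl(attrs: Map[String, String], isProlog: Boolean)
      : (Option[String], Option[String], Option[Boolean]) = {
    val ver = attrs.get("version")
    val enc = attrs.get("encoding")
    // the standalone pseudo-attribute only exists in a document prolog
    val stdl =
      if (isProlog) attrs.get("standalone") map (_ == "yes")
      else None
    (ver, enc, stdl)
  }

  def prolog(attrs: Map[String, String]) = parseDecl(attrs, true)
  def textDecl(attrs: Map[String, String]) =
    parseDecl(attrs, false) match { case (v, e, _) => (v, e) }

  def main(args: Array[String]): Unit = {
    println(prolog(Map("version" -> "1.0", "standalone" -> "yes")))
    println(textDecl(Map("version" -> "1.0", "encoding" -> "UTF-8")))
  }
}
```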
diff --git a/src/library/scala/xml/parsing/MarkupParserCommon.scala b/src/library/scala/xml/parsing/MarkupParserCommon.scala
index 57c46c4685..936515852b 100644
--- a/src/library/scala/xml/parsing/MarkupParserCommon.scala
+++ b/src/library/scala/xml/parsing/MarkupParserCommon.scala
@@ -11,30 +11,191 @@ package parsing
import scala.io.Source
import scala.xml.dtd._
+import scala.annotation.switch
import Utility.Escapes.{ pairs => unescape }
+object MarkupParserCommon {
+ final val SU = '\u001A'
+}
+import MarkupParserCommon._
+
/** This is not a public trait - it contains common code shared
* between the library level XML parser and the compiler's.
* All members should be accessed through those.
*/
private[scala] trait MarkupParserCommon extends TokenTests {
- private final val SU: Char = 0x1A
protected def unreachable = Predef.error("Cannot be reached.")
- // type HandleType // MarkupHandler, SymbolicXMLBuilder
-
+ // type HandleType // MarkupHandler, SymbolicXMLBuilder
type InputType // Source, CharArrayReader
type PositionType // Int, Position
+ type ElementType // NodeSeq, Tree
+ type NamespaceType // NamespaceBinding, Any
+ type AttributesType // (MetaData, NamespaceBinding), mutable.Map[String, Tree]
+
+ def mkAttributes(name: String, pscope: NamespaceType): AttributesType
+ def mkProcInstr(position: PositionType, name: String, text: String): ElementType
+
+ /** parse a start or empty tag.
+ * [40] STag ::= '<' Name { S Attribute } [S]
+ * [44] EmptyElemTag ::= '<' Name { S Attribute } [S]
+ */
+ protected def xTag(pscope: NamespaceType): (String, AttributesType) = {
+ val name = xName
+ xSpaceOpt
+
+ (name, mkAttributes(name, pscope))
+ }
+
+ /** '<?' ProcInstr ::= Name [S ({Char} - ({Char}'>?' {Char})]'?>'
+ *
+ * see [15]
+ */
+ def xProcInstr: ElementType = {
+ val n = xName
+ xSpaceOpt
+ xTakeUntil(mkProcInstr(_, n, _), () => tmppos, "?>")
+ }
+
+ /** attribute value, terminated by either ' or ". value may not contain <.
+ * @param endch either ' or "
+ */
+ def xAttributeValue(endCh: Char): String = {
+ val buf = new StringBuilder
+ while (ch != endCh) {
+ // well-formedness constraint
+ if (ch == '<') return errorAndResult("'<' not allowed in attrib value", "")
+ else if (ch == SU) truncatedError("")
+ else buf append ch_returning_nextch
+ }
+ ch_returning_nextch
+ // @todo: normalize attribute value
+ buf.toString
+ }
+
+ def xAttributeValue(): String = {
+ val str = xAttributeValue(ch_returning_nextch)
+ // well-formedness constraint
+ normalizeAttributeValue(str)
+ }
+
+ private def takeUntilChar(it: Iterator[Char], end: Char): String = {
+ val buf = new StringBuilder
+ while (it.hasNext) it.next match {
+ case `end` => return buf.toString
+ case ch => buf append ch
+ }
+ error("Expected '%s'".format(end))
+ }
+
+ /** [42] '<' xmlEndTag ::= '<' '/' Name S? '>'
+ */
+ def xEndTag(startName: String) {
+ xToken('/')
+ if (xName != startName)
+ errorNoEnd(startName)
+
+ xSpaceOpt
+ xToken('>')
+ }
+
+ /** actually, Name ::= (Letter | '_' | ':') (NameChar)* but starting with ':' cannot happen
+ * Name ::= (Letter | '_') (NameChar)*
+ *
+ * see [5] of XML 1.0 specification
+ *
+ * pre-condition: ch != ':' // assured by definition of XMLSTART token
+ * post-condition: name does neither start, nor end in ':'
+ */
+ def xName: String = {
+ if (ch == SU)
+ truncatedError("")
+ else if (!isNameStart(ch))
+ return errorAndResult("name expected, but char '%s' cannot start a name" format ch, "")
+
+ val buf = new StringBuilder
+
+ do buf append ch_returning_nextch
+ while (isNameChar(ch))
+
+ if (buf.last == ':') {
+ reportSyntaxError( "name cannot end in ':'" )
+ buf.toString dropRight 1
+ }
+ else buf.toString
+ }
+
+ private def attr_unescape(s: String) = s match {
+ case "lt" => "<"
+ case "gt" => ">"
+ case "amp" => "&"
+ case "apos" => "'"
+ case "quot" => "\""
+ case "quote" => "\""
+ case _ => "&" + s + ";"
+ }
+
+ /** Replaces only character references right now.
+ * see spec 3.3.3
+ */
+ private def normalizeAttributeValue(attval: String): String = {
+ val buf = new StringBuilder
+ val it = attval.iterator.buffered
+
+ while (it.hasNext) buf append (it.next match {
+ case ' ' | '\t' | '\n' | '\r' => " "
+ case '&' if it.head == '#' => it.next ; xCharRef(it)
+ case '&' => attr_unescape(takeUntilChar(it, ';'))
+ case c => c
+ })
+
+ buf.toString
+ }
+
+ /** CharRef ::= "&#" '0'..'9' {'0'..'9'} ";"
+ * | "&#x" '0'..'9'|'A'..'F'|'a'..'f' { hexdigit } ";"
+ *
+ * see [66]
+ */
+ def xCharRef(ch: () => Char, nextch: () => Unit): String =
+ Utility.parseCharRef(ch, nextch, reportSyntaxError _)
+
+ def xCharRef(it: Iterator[Char]): String = {
+ var c = it.next
+ Utility.parseCharRef(() => c, () => { c = it.next }, reportSyntaxError _)
+ }
+
+ def xCharRef: String = xCharRef(() => ch, () => nextch)
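Production [66] admits decimal (`&#65;`) and hexadecimal (`&#x41;`) forms. A simplified stand-in for `Utility.parseCharRef` (the real version validates digits incrementally as it pulls characters and reports malformed input via `reportSyntaxError`; this sketch just decodes an already-extracted body between `&#` and `;`, and is limited to BMP code points):

```scala
object CharRefDemo {
  /** Decode the body of a character reference, e.g. "65" or "x41".
   *  Assumes a BMP code point; supplementary planes would need surrogate pairs.
   */
  def decode(body: String): String = {
    val code =
      if (body.startsWith("x") || body.startsWith("X"))
        Integer.parseInt(body.substring(1), 16)   // hexadecimal form
      else
        Integer.parseInt(body, 10)                // decimal form
    code.toChar.toString
  }
}
```

Both `decode("65")` and `decode("x41")` yield `"A"`, matching the two grammar alternatives.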
/** Create a lookahead reader which does not influence the input */
def lookahead(): BufferedIterator[Char]
+ /** The library and compiler parsers had the interesting distinction of
+ * different behavior for nextch (a function for which there are a total
+ * of two plausible behaviors, so we know the design space was fully
+ * explored.) One of them returned the value of nextch before the increment
+ * and one of them the new value. So to unify code we have to at least
+ * temporarily abstract over the nextchs.
+ */
def ch: Char
def nextch: Char
+ def ch_returning_nextch: Char
+ def eof: Boolean
+
+ // def handle: HandleType
+ var tmppos: PositionType
+
def xHandleError(that: Char, msg: String): Unit
def reportSyntaxError(str: String): Unit
def reportSyntaxError(pos: Int, str: String): Unit
- def eof: Boolean
+
+ def truncatedError(msg: String): Nothing
+ def errorNoEnd(tag: String): Nothing
+
+ protected def errorAndResult[T](msg: String, x: T): T = {
+ reportSyntaxError(msg)
+ x
+ }
def xToken(that: Char) {
if (ch == that) nextch
@@ -53,9 +214,16 @@ private[scala] trait MarkupParserCommon extends TokenTests {
if (isSpace(ch)) { nextch; xSpaceOpt }
else xHandleError(ch, "whitespace expected")
- //
+ /** Apply a function and return the passed value */
def returning[T](x: T)(f: T => Unit): T = { f(x) ; x }
+ /** Execute body with a variable saved and restored after execution */
+ def saving[A,B](getter: A, setter: (A) => Unit)(body: => B): B = {
+ val saved = getter
+ try body
+ finally setter(saved)
+ }
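`saving` is a small bracket combinator: stash a piece of mutable state, run the body, and restore the state even if the body throws. A self-contained example of how it behaves (the surrounding object and `demo` method are hypothetical, for illustration only):

```scala
object SavingDemo {
  def returning[T](x: T)(f: T => Unit): T = { f(x); x }

  def saving[A, B](getter: A, setter: A => Unit)(body: => B): B = {
    val saved = getter      // `getter` is evaluated once, up front
    try body
    finally setter(saved)   // restored even if `body` throws
  }

  var pos = 10

  def demo(): Int =
    saving(pos, (p: Int) => pos = p) {
      pos = 99              // temporarily clobber the state
      pos * 2               // body result: 198
    }
}
```

After `demo()` returns 198, `pos` is back at 10: the `finally` block reinstates the saved value regardless of how the body exits.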
+
/** Take characters from input stream until given String "until"
* is seen. Once seen, the accumulated characters are passed
* along with the current Position to the supplied handler function.
@@ -73,7 +241,7 @@ private[scala] trait MarkupParserCommon extends TokenTests {
if (ch == head && peek(rest))
return handler(positioner(), sb.toString)
else if (ch == SU)
- xHandleError(ch, "") // throws TruncatedXML in compiler
+ truncatedError("") // throws TruncatedXMLControl in compiler
sb append ch
nextch
diff --git a/src/library/scala/xml/parsing/NoBindingFactoryAdapter.scala b/src/library/scala/xml/parsing/NoBindingFactoryAdapter.scala
index 05d535155b..083465bc41 100644
--- a/src/library/scala/xml/parsing/NoBindingFactoryAdapter.scala
+++ b/src/library/scala/xml/parsing/NoBindingFactoryAdapter.scala
@@ -12,10 +12,6 @@ package scala.xml
package parsing
import factory.NodeFactory
-import collection.Seq
-import collection.immutable.List
-import org.xml.sax.InputSource
-import javax.xml.parsers.{ SAXParser, SAXParserFactory }
/** nobinding adaptor providing callbacks to parser to create elements.
* implements hash-consing
diff --git a/src/library/scala/xml/parsing/TokenTests.scala b/src/library/scala/xml/parsing/TokenTests.scala
index e41cff20a3..13500e8510 100644
--- a/src/library/scala/xml/parsing/TokenTests.scala
+++ b/src/library/scala/xml/parsing/TokenTests.scala
@@ -12,8 +12,6 @@
package scala.xml
package parsing
-import collection.Seq
-import collection.immutable.List
/**
* Helper functions for parsing XML fragments
*/
diff --git a/src/library/scala/xml/parsing/ValidatingMarkupHandler.scala b/src/library/scala/xml/parsing/ValidatingMarkupHandler.scala
index 06828b7320..00126c4881 100644
--- a/src/library/scala/xml/parsing/ValidatingMarkupHandler.scala
+++ b/src/library/scala/xml/parsing/ValidatingMarkupHandler.scala
@@ -51,7 +51,7 @@ abstract class ValidatingMarkupHandler extends MarkupHandler with Logged {
log("advanceDFA(trans): " + trans)
trans.get(ContentModel.ElemName(label)) match {
case Some(qNew) => qCurrent = qNew
- case _ => reportValidationError(pos, "DTD says, wrong element, expected one of "+trans.keysIterator);
+ case _ => reportValidationError(pos, "DTD says, wrong element, expected one of "+trans.keys);
}
}
// advance in current automaton
diff --git a/src/library/scala/xml/parsing/XhtmlEntities.scala b/src/library/scala/xml/parsing/XhtmlEntities.scala
index dbc2ae0621..6e35aa9606 100644
--- a/src/library/scala/xml/parsing/XhtmlEntities.scala
+++ b/src/library/scala/xml/parsing/XhtmlEntities.scala
@@ -8,11 +8,10 @@
// $Id$
-
package scala.xml
package parsing
-import scala.xml.dtd.{IntDef, ParsedEntityDecl}
+import scala.xml.dtd.{ IntDef, ParsedEntityDecl }
/** <p>
* (c) David Pollak 2007 WorldWide Conferencing, LLC.