author    Antonio Cunei <antonio.cunei@epfl.ch>  2010-07-14 17:42:54 +0000
committer Antonio Cunei <antonio.cunei@epfl.ch>  2010-07-14 17:42:54 +0000
commit    173a8a1a06b5736678c189284a2976448e4de139 (patch)
tree      292f78ecd329ad29a13d81298eb925ffeaed6bda /src/library
parent    376085286486500c39d563e3b5d91befe2453f34 (diff)
Merged revisions 22108,22114,22121,22130-22131,22137,22140-22142,22147-22149,
22167-22168,22174,22176-22177,22182,22186,22188-22189,22194-22199,22211,22215,
22234,22248-22249,22260-22261,22276 via svnmerge from
https://lampsvn.epfl.ch/svn-repos/scala/scala/trunk

r22108 | odersky   | 2010-05-31 | Made hashset more robust for concurrent access to reduce Eclipse race conditions.
r22114 | prokopec  | 2010-06-01 | Fixes #3496. No review.
r22121 | rytz      | 2010-06-02 | Some tests. No review.
r22130 | dubochet  | 2010-06-02 | Removed unnecessary files containing code with an uncertain copyright status.
r22131 | dubochet  | 2010-06-02 | Added mandatory copyright notices for some libraries shipped with Scala.
r22137 | dubochet  | 2010-06-02 | Added more license notices for things shipped with Scala.
r22140 | dubochet  | 2010-06-02 | [scaladoc] Updated man page for Scaladoc 2.
r22141 | prokopec  | 2010-06-02 | Partially solves the problem for #3502. Review by extempore.
         This commit reimplements filter for Streams, but does not reimplement map in
         StreamWithFilter. The problem is that GC can't collect instances of Streams
         residing on the stack if there are multiple references to the Stream (more
         than the single one on the stack on which a Stream method is invoked). In the
         case of a StreamWithFilter, being an inner class, there is always an `$outer`
         reference to the outer object, so there is little GC can do.
         Possible solution: change the return type of WithFilter to something else (in
         TraversableLike) to allow it to return objects that don't have to subclass
         TraversableLike.WithFilter, and reimplement the withFilter method in Stream
         to simply call the `filter` method - in the case of Streams, `withFilter`
         makes little sense in either case...
r22142 | prokopec  | 2010-06-02 | Fixes #3508. No review is necessary.
r22147 | prokopec  | 2010-06-03 | Fixes #3511 by adding a custom StreamView. Review by extempore - please advise whether we should keep this.
r22148 | prokopec  | 2010-06-03 | Forgot to add stream view classes for #3511. Review by extempore.
r22149 | dubochet  | 2010-06-03 | [scaladoc] Fixed typo in Scaladoc man page (thanks Stéphane). No review.
r22167 | extempore | 2010-06-04 | Fix for an init-order-caused NPE in NumericRange. While I was in there, ran across some tortured logic trying to accommodate the long-abandoned idea of having 5 != 5L, so simplified the contains method. Closes #3518, no review.
r22168 | extempore | 2010-06-04 | Tracked down why jvm/natives.scala fails for me and apparently not anyone else. Rebuilt libnatives.jnlib to accommodate x86-64, and it seems to pass. No review.
r22174 | michelou  | 2010-06-05 | Added/updated Android examples.
r22176 | odersky   | 2010-06-06 | Overrode copyToArray for efficiency.
r22177 | odersky   | 2010-06-06 | Fixed a typo in a use case.
r22182 | prokopec  | 2010-06-07 | Adding parallel collections to trunk. sabbus also edited to add parallel collections to the library jar. Review by phaller.
r22186 | extempore | 2010-06-07 | Made scripts wait for all non-daemon threads to exit before calling System.exit. Closes #1955, #2006, #3408. Review by community.
r22188 | extempore | 2010-06-08 | Most of the iterate implementations were calling the given function one too many times, leading to tragic failure if the function could not handle this (such as repeatedly applying tail). Closes #3540, review by prokopec.
r22189 | extempore | 2010-06-08 | Taking another shot at negative constants as annotation arguments, since r22175 didn't quite get there. I call into the constant folder with the unfolded tree at the last point before it would fail the compile anyway. Closes #3521, review by odersky.
r22194 | michelou  | 2010-06-08 | Added/updated Android examples (cont'd).
r22195 | extempore | 2010-06-08 | Fixed a regrettable oversight which was leaving temp files stacking up in templand, and a partial fix for #3519. No review.
r22196 | prokopec  | 2010-06-08 | Fixes #3461. No review.
r22197 | michelou  | 2010-06-08 | Fixed setenv task (Android examples).
r22198 | extempore | 2010-06-08 | Put in some long-overdue soft padding around repl completion, so when it pokes around the compiler in a way which surprises something, we don't lose the repl. Closes #3548, no review.
r22199 | prokopec  | 2010-06-09 | Added `combine` and `split` to immutable.HashMap. Under test/benchmarks there is a `bench` script to run benchmarks; it can be invoked after building the library. Review by rompf.
r22211 | prokopec  | 2010-06-10 | HashMap merge bug fixed. No review.
r22215 | prokopec  | 2010-06-10 | Continued working on hash trie map combine - work in progress. No review yet.
r22234 | prokopec  | 2010-06-11 | Further improved combine for hash tries, cutting off another 30ms (160 down to 130). Review by rompf.
r22248 | michelou  | 2010-06-12 | Updated build scripts (Android examples).
r22249 | michelou  | 2010-06-12 | Moved README file (Android examples).
r22260 | extempore | 2010-06-13 | Changed groupBy to return immutable.Map. Closes #3550, review by odersky.
r22261 | extempore | 2010-06-13 | Made getters treated more like private members when debating whether to inline. Closes #3420, review by dragos.
r22276 | dragos    | 2010-06-14 | Closes #3420, typo in scaladoc for BigInt. No review.
Diffstat (limited to 'src/library')
-rw-r--r--  src/library/scala/Array.scala                                  |  17
-rw-r--r--  src/library/scala/collection/IterableLike.scala                |   3
-rw-r--r--  src/library/scala/collection/TraversableOnce.scala             |   2
-rw-r--r--  src/library/scala/collection/generic/TraversableFactory.scala  |  16
-rw-r--r--  src/library/scala/collection/immutable/HashMap.scala           | 220
-rw-r--r--  src/library/scala/collection/immutable/NumericRange.scala      |  45
-rw-r--r--  src/library/scala/collection/immutable/Stream.scala            |  32
-rw-r--r--  src/library/scala/collection/immutable/StreamView.scala        |  12
-rw-r--r--  src/library/scala/collection/immutable/StreamViewLike.scala    |  76
-rw-r--r--  src/library/scala/collection/mutable/ArrayOps.scala            |   8
-rw-r--r--  src/library/scala/math/BigInt.scala                            |   6
11 files changed, 378 insertions, 59 deletions
diff --git a/src/library/scala/Array.scala b/src/library/scala/Array.scala
index 9712990d73..250fd09602 100644
--- a/src/library/scala/Array.scala
+++ b/src/library/scala/Array.scala
@@ -363,13 +363,18 @@ object Array extends FallbackArrayBuilding {
*/
def iterate[T: ClassManifest](start: T, len: Int)(f: T => T): Array[T] = {
val b = newBuilder[T]
- b.sizeHint(len)
- var acc = start
- var i = 0
- while (i < len) {
+
+ if (len > 0) {
+ b.sizeHint(len)
+ var acc = start
+ var i = 1
b += acc
- acc = f(acc)
- i += 1
+
+ while (i < len) {
+ acc = f(acc)
+ i += 1
+ b += acc
+ }
}
b.result
}
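The hunk above is the r22188 fix: the old loop body appended `acc` and then computed `f(acc)` on every iteration, so `f` was applied `len` times with the last result discarded - fatal when `f` cannot handle one extra step (e.g. repeatedly taking `tail`). The restructured loop applies `f` exactly `len - 1` times. A standalone sketch of the corrected shape (using a List builder for illustration; names are ours, not the library's):

```scala
// Sketch of the fixed iterate loop: the seed is emitted first, then
// f is applied len - 1 times, never once more than needed.
def iterateList[T](start: T, len: Int)(f: T => T): List[T] = {
  val b = List.newBuilder[T]
  if (len > 0) {
    var acc = start
    var i = 1
    b += acc
    while (i < len) {
      acc = f(acc)
      i += 1
      b += acc
    }
  }
  b.result()
}
```

With the old shape, `iterateList(List(1, 2), 2)(_.tail)` would have evaluated `Nil.tail` and thrown; with the fix it stops at the last requested element.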
diff --git a/src/library/scala/collection/IterableLike.scala b/src/library/scala/collection/IterableLike.scala
index da18c712f5..538fd09c0e 100644
--- a/src/library/scala/collection/IterableLike.scala
+++ b/src/library/scala/collection/IterableLike.scala
@@ -158,7 +158,8 @@ self =>
* @param step the distance between the first elements of successive
* groups (defaults to 1)
* @return An iterator producing ${coll}s of size `size`, except the
- * last will be truncated if the elements don't divide evenly.
+ * last and the only element will be truncated if there are
+ * fewer elements than size.
*/
def sliding[B >: A](size: Int): Iterator[Repr] = sliding(size, 1)
def sliding[B >: A](size: Int, step: Int): Iterator[Repr] =
diff --git a/src/library/scala/collection/TraversableOnce.scala b/src/library/scala/collection/TraversableOnce.scala
index b6c0ce146e..de4eb6fc22 100644
--- a/src/library/scala/collection/TraversableOnce.scala
+++ b/src/library/scala/collection/TraversableOnce.scala
@@ -277,7 +277,7 @@ trait TraversableOnce[+A] {
* @tparam B the result type of the `+` operator.
* @return the sum of all elements of this $coll with respect to the `+` operator in `num`.
*
- * @usecase def sum: Int
+ * @usecase def sum: A
*
* @return the sum of all elements in this $coll of numbers of type `Int`.
* Instead of `Int`, any other type `T` with an implicit `Numeric[T]` implementation
diff --git a/src/library/scala/collection/generic/TraversableFactory.scala b/src/library/scala/collection/generic/TraversableFactory.scala
index d8541d2714..c6f5ce4dde 100644
--- a/src/library/scala/collection/generic/TraversableFactory.scala
+++ b/src/library/scala/collection/generic/TraversableFactory.scala
@@ -224,13 +224,17 @@ abstract class TraversableFactory[CC[X] <: Traversable[X] with GenericTraversabl
*/
def iterate[A](start: A, len: Int)(f: A => A): CC[A] = {
val b = newBuilder[A]
- b.sizeHint(len)
- var acc = start
- var i = 0
- while (i < len) {
+ if (len > 0) {
+ b.sizeHint(len)
+ var acc = start
+ var i = 1
b += acc
- acc = f(acc)
- i += 1
+
+ while (i < len) {
+ acc = f(acc)
+ i += 1
+ b += acc
+ }
}
b.result
}
diff --git a/src/library/scala/collection/immutable/HashMap.scala b/src/library/scala/collection/immutable/HashMap.scala
index 01ef597d24..11292bdf0c 100644
--- a/src/library/scala/collection/immutable/HashMap.scala
+++ b/src/library/scala/collection/immutable/HashMap.scala
@@ -76,9 +76,14 @@ class HashMap[A, +B] extends Map[A,B] with MapLike[A, B, HashMap[A, B]] {
protected def removed0(key: A, hash: Int, level: Int): HashMap[A, B] = this
-
protected def writeReplace(): AnyRef = new HashMap.SerializationProxy(this)
+ def split: Seq[HashMap[A, B]] = Seq(this)
+
+ def combine[B1 >: B](that: HashMap[A, B1]): HashMap[A, B1] = combine0(that, 0)
+
+ protected def combine0[B1 >: B](that: HashMap[A, B1], level: Int): HashMap[A, B1] = that
+
}
/** $factoryInfo
@@ -99,19 +104,61 @@ object HashMap extends ImmutableMapFactory[HashMap] {
// TODO: add HashMap2, HashMap3, ...
- class HashMap1[A,+B](private var key: A, private[HashMap] var hash: Int, private var value: (B @uncheckedVariance), private var kv: (A,B @uncheckedVariance)) extends HashMap[A,B] {
+ // statistics - will remove in future
+ var dives = 0
+ var colls = 0
+ var two_colls = 0
+ var two_nocolls = 0
+
+
+ class HashMap1[A,+B](private[HashMap] var key: A, private[HashMap] var hash: Int, private[HashMap] var value: (B @uncheckedVariance), private[HashMap] var kv: (A,B @uncheckedVariance)) extends HashMap[A,B] {
override def size = 1
override def get0(key: A, hash: Int, level: Int): Option[B] =
if (hash == this.hash && key == this.key) Some(value) else None
+ // override def updated0[B1 >: B](key: A, hash: Int, level: Int, value: B1, kv: (A, B1)): HashMap[A, B1] =
+ // if (hash == this.hash && key == this.key) new HashMap1(key, hash, value, kv)
+ // else {
+ // var thatindex = (hash >>> level) & 0x1f
+ // var thisindex = (this.hash >>> level) & 0x1f
+ // if (hash != this.hash) {
+ // //new HashTrieMap[A,B1](level+5, this, new HashMap1(key, hash, value, kv))
+ // val m = new HashTrieMap[A,B1](0,new Array[HashMap[A,B1]](0),0) // TODO: could save array alloc
+ // m.updated0(this.key, this.hash, level, this.value, this.kv).updated0(key, hash, level, value, kv) // TODO and it will
+ // } else {
+ // // 32-bit hash collision (rare, but not impossible)
+ // new HashMapCollision1(hash, ListMap.empty.updated(this.key,this.value).updated(key,value))
+ // }
+ // }
+
override def updated0[B1 >: B](key: A, hash: Int, level: Int, value: B1, kv: (A, B1)): HashMap[A, B1] =
if (hash == this.hash && key == this.key) new HashMap1(key, hash, value, kv)
else {
+ var thatindex = (hash >>> level) & 0x1f
+ var thisindex = (this.hash >>> level) & 0x1f
if (hash != this.hash) {
- //new HashTrieMap[A,B1](level+5, this, new HashMap1(key, hash, value, kv))
- val m = new HashTrieMap[A,B1](0,new Array[HashMap[A,B1]](0),0) // TODO: could save array alloc
- m.updated0(this.key, this.hash, level, this.value, this.kv).updated0(key, hash, level, value, kv)
+ // they have different hashes, but may collide at this level - find a level at which they don't
+ var lvl = level
+ var top: HashTrieMap[A, B1] = null
+ var prev: HashTrieMap[A, B1] = null
+ while (thisindex == thatindex) {
+ val newlevel = new HashTrieMap[A, B1](1 << thisindex, new Array[HashMap[A, B1]](1), 2)
+ if (prev ne null) prev.elems(0) = newlevel else top = newlevel
+ prev = newlevel
+ lvl += 5
+ thatindex = (hash >>> lvl) & 0x1f
+ thisindex = (this.hash >>> lvl) & 0x1f
+ }
+ val bottelems = new Array[HashMap[A,B1]](2)
+ val ind = if (thisindex < thatindex) 1 else 0
+ bottelems(1 - ind) = this
+ bottelems(ind) = new HashMap1[A, B1](key, hash, value, kv)
+ val bottom = new HashTrieMap[A,B1]((1 << thisindex) | (1 << thatindex), bottelems, 2)
+ if (prev ne null) {
+ prev.elems(0) = bottom
+ top
+ } else bottom
} else {
// 32-bit hash collision (rare, but not impossible)
new HashMapCollision1(hash, ListMap.empty.updated(this.key,this.value).updated(key,value))
@@ -124,6 +171,7 @@ object HashMap extends ImmutableMapFactory[HashMap] {
override def iterator: Iterator[(A,B)] = Iterator(ensurePair)
override def foreach[U](f: ((A, B)) => U): Unit = f(ensurePair)
private[HashMap] def ensurePair: (A,B) = if (kv ne null) kv else { kv = (key, value); kv }
+ protected override def combine0[B1 >: B](that: HashMap[A, B1], level: Int): HashMap[A, B1] = that.updated0(key, hash, level, value, kv)
}
private class HashMapCollision1[A,+B](private[HashMap] var hash: Int, var kvs: ListMap[A,B @uncheckedVariance]) extends HashMap[A,B] {
@@ -153,11 +201,21 @@ object HashMap extends ImmutableMapFactory[HashMap] {
override def iterator: Iterator[(A,B)] = kvs.iterator
override def foreach[U](f: ((A, B)) => U): Unit = kvs.foreach(f)
+ override def split: Seq[HashMap[A, B]] = {
+ val (x, y) = kvs.splitAt(kvs.size / 2)
+ def newhm(lm: ListMap[A, B @uncheckedVariance]) = new HashMapCollision1(hash, lm)
+ List(newhm(x), newhm(y))
+ }
+ protected override def combine0[B1 >: B](that: HashMap[A, B1], level: Int): HashMap[A, B1] = {
+ // this can be made more efficient by passing the entire ListMap at once
+ var m = that
+ for (p <- kvs) m = m.updated0(p._1, this.hash, level, p._2, p)
+ m
+ }
}
-
- class HashTrieMap[A,+B](private var bitmap: Int, private var elems: Array[HashMap[A,B @uncheckedVariance]],
- private var size0: Int) extends HashMap[A,B] {
+ class HashTrieMap[A,+B](private[HashMap] var bitmap: Int, private[HashMap] var elems: Array[HashMap[A,B @uncheckedVariance]],
+ private[HashMap] var size0: Int) extends HashMap[A,B] {
/*
def this (level: Int, m1: HashMap1[A,B], m2: HashMap1[A,B]) = {
this(((m1.hash >>> level) & 0x1f) | ((m2.hash >>> level) & 0x1f), {
@@ -346,6 +404,152 @@ time { mNew.iterator.foreach( p => ()) }
}
}
+ private def printBitmap(bm: Int) {
+ var i = 32
+ var b = bm
+ while (i != 0) {
+ print((b & 1) + " ")
+ b = b >>> 1
+ i -= 1
+ }
+ println
+ }
+
+ private def posOf(n: Int, bm: Int) = {
+ var left = n
+ var i = -1
+ var b = bm
+ while (left >= 0) {
+ i += 1
+ if ((b & 1) != 0) left -= 1
+ b = b >>> 1
+ }
+ i
+ }
+
+ override def split: Seq[HashMap[A, B]] = {
+ // printBitmap(bitmap)
+ // println(elems.toList)
+
+ // println("subtrees: " + Integer.bitCount(bitmap))
+ // println("will split at: " + posOf(Integer.bitCount(bitmap) / 2, bitmap))
+ val splitpoint = posOf(Integer.bitCount(bitmap) / 2, bitmap)
+ val bm1 = bitmap & (-1 << splitpoint)
+ val bm2 = bitmap & (-1 >>> (32 - splitpoint))
+ // printBitmap(bm1)
+ // printBitmap(bm2)
+ val (e1, e2) = elems.splitAt(splitpoint)
+ // println(e1.toList)
+ // println(e2.toList)
+ val hm1 = new HashTrieMap(bm1, e1, e1.foldLeft(0)(_ + _.size))
+ val hm2 = new HashTrieMap(bm2, e2, e2.foldLeft(0)(_ + _.size))
+
+ List(hm1, hm2)
+ }
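The `split` above partitions the trie's children at the middle set bit: `bitmap & (-1 << splitpoint)` keeps the high bits and `bitmap & (-1 >>> (32 - splitpoint))` keeps the low bits. A minimal sketch of that mask arithmetic on plain Ints (illustrative only; note JVM shifts are mod 32, so `splitpoint` must stay in 1..31):

```scala
// Partition the set bits of a bitmap at a given bit position:
// bits at index >= splitpoint go to `high`, bits below go to `low`.
def splitBitmap(bitmap: Int, splitpoint: Int): (Int, Int) = {
  val high = bitmap & (-1 << splitpoint)          // keep bits splitpoint..31
  val low  = bitmap & (-1 >>> (32 - splitpoint))  // keep bits 0..splitpoint-1
  (high, low)
}
```

Since the two masks are complementary, `high | low == bitmap` always holds, which is what lets the two sub-tries be recombined losslessly.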
+
+ protected override def combine0[B1 >: B](that: HashMap[A, B1], level: Int): HashMap[A, B1] = that match {
+ case hm: HashMap1[_, _] =>
+ this.updated0(hm.key, hm.hash, level, hm.value.asInstanceOf[B1], hm.kv)
+ case hm: HashMapCollision1[_, _] =>
+ var m: HashMap[A, B1] = this
+ for (p <- that) m = m.updated0(p._1, computeHash(p._1), level, p._2, p)
+ m
+ case hm: HashTrieMap[_, _] =>
+ val that = hm.asInstanceOf[HashTrieMap[A, B1]]
+ val thiselems = this.elems
+ val thatelems = that.elems
+ var thisbm = this.bitmap
+ var thatbm = that.bitmap
+
+ // determine the necessary size for the array
+ val subcount = Integer.bitCount(thisbm | thatbm)
+
+ // construct a new array of appropriate size
+ val combined = new Array[HashMap[A, B1 @uncheckedVariance]](subcount)
+
+ // run through both bitmaps and add elements to it
+ var i = 0
+ var thisi = 0
+ var thati = 0
+ var totalelems = 0
+ while (i < subcount) {
+ val thislsb = thisbm ^ (thisbm & (thisbm - 1))
+ val thatlsb = thatbm ^ (thatbm & (thatbm - 1))
+ // if (this.bitmap == -1660585213) { TODO remove
+ // printBitmap(thislsb)
+ // printBitmap(thatlsb)
+ // println("------------------")
+ // }
+ if (thislsb == thatlsb) {
+ // println("a collision")
+ val m = thiselems(thisi).combine0(thatelems(thati), level + 5)
+ totalelems += m.size
+ combined(i) = m
+ thisbm = thisbm & ~thislsb
+ thatbm = thatbm & ~thatlsb
+ thati += 1
+ thisi += 1
+ } else {
+ // condition below is due to 2 things:
+ // 1) no unsigned int compare on JVM
+ // 2) 0 (no lsb) should always be greater in comparison
+ // also, search for unsigned compare Scala to find Dave's solution
+ // and compare a and b defined as below:
+ val a = thislsb - 1
+ val b = thatlsb - 1
+ // ! our case indeed is more specific, but this didn't help:
+ // if ((thislsb > 0 && thislsb < thatlsb) || thatlsb == 0 || (thatlsb < 0 && thislsb != 0)) {
+ if ((a < b) ^ (a < 0) ^ (b < 0)) {
+ // println("an element from this trie")
+ val m = thiselems(thisi)
+ totalelems += m.size
+ combined(i) = m
+ thisbm = thisbm & ~thislsb
+ thisi += 1
+ } else {
+ // println("an element from that trie")
+ val m = thatelems(thati)
+ totalelems += m.size
+ combined(i) = m
+ thatbm = thatbm & ~thatlsb
+ thati += 1
+ }
+ }
+ i += 1
+ }
+
+ val res = new HashTrieMap[A, B1](this.bitmap | that.bitmap, combined, totalelems)
+ // if (!check(this, that, res)) { TODO remove
+ // printBitmap(this.bitmap)
+ // printBitmap(that.bitmap)
+ // printBitmap(res.bitmap)
+ // println(this.bitmap)
+ // System.exit(1)
+ // }
+ res
+ case empty: HashMap[_, _] => this
+ case _ => error("section supposed to be unreachable.")
+ }
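Two bit tricks carry the merge loop above: `x ^ (x & (x - 1))` isolates the lowest set bit of `x` (equivalent to `x & -x`), and `(a < b) ^ (a < 0) ^ (b < 0)` is an unsigned comparison on the JVM, which has no unsigned int compare (this code predates `Integer.compareUnsigned`). The loop applies it to `lsb - 1` so that an exhausted bitmap (`lsb == 0`, hence `a == -1`, the largest unsigned value) always loses the comparison. A standalone sketch of both helpers:

```scala
// Isolate the lowest set bit: clearing it with x & (x - 1),
// then XOR-ing back recovers just that bit.
def lowestSetBit(x: Int): Int = x ^ (x & (x - 1))

// Unsigned less-than via the sign-flip XOR trick: the signed result
// is flipped whenever exactly one operand has its top bit set
// (i.e. is huge when read as unsigned).
def unsignedLt(a: Int, b: Int): Boolean = (a < b) ^ (a < 0) ^ (b < 0)
```

For example, `unsignedLt(1, -1)` is true because `-1` reads as 4294967295 unsigned, while the signed comparison `1 < -1` is false.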
+
+ }
+
+ private def check[K](x: HashMap[K, _], y: HashMap[K, _], xy: HashMap[K, _]) = { // TODO remove this debugging helper
+ var xs = Set[K]()
+ for (elem <- x) xs += elem._1
+ var ys = Set[K]()
+ for (elem <- y) ys += elem._1
+ var union = Set[K]()
+ for (elem <- xy) union += elem._1
+ if ((xs ++ ys) != union) {
+ println("Error.")
+ println(x.getClass)
+ println(y.getClass)
+ println(xs)
+ println(ys)
+ println(xs ++ ys)
+ println(union)
+ false
+ } else true
}
@serializable @SerialVersionUID(2L) private class SerializationProxy[A,B](@transient private var orig: HashMap[A, B]) {
diff --git a/src/library/scala/collection/immutable/NumericRange.scala b/src/library/scala/collection/immutable/NumericRange.scala
index db44e9ffa0..b8bd5bd20e 100644
--- a/src/library/scala/collection/immutable/NumericRange.scala
+++ b/src/library/scala/collection/immutable/NumericRange.scala
@@ -40,8 +40,8 @@ import generic._
abstract class NumericRange[T]
(val start: T, val end: T, val step: T, val isInclusive: Boolean)
(implicit num: Integral[T])
-extends IndexedSeq[T]
-{
+extends IndexedSeq[T] {
+
/** Note that NumericRange must be invariant so that constructs
* such as
*
@@ -122,18 +122,6 @@ extends IndexedSeq[T]
else start + (fromInt(idx) * step)
}
- // a well-typed contains method.
- def containsTyped(x: T): Boolean = {
- def divides(d: T, by: T) = equiv(d % by, zero)
-
- limitTest(x) || (
- if (step > zero)
- (start <= x) && (x < end) && divides(x - start, step)
- else
- (start >= x) && (x > end) && divides(start - x, step)
- )
- }
-
// Motivated by the desire for Double ranges with BigDecimal precision,
// we need some way to map a Range and get another Range. This can't be
// done in any fully general way because Ranges are not arbitrary
@@ -165,7 +153,7 @@ extends IndexedSeq[T]
if (isInclusive) NumericRange.inclusive(start, end, step)
else NumericRange(start, end, step)
- private val underlyingRange: NumericRange[T] = self
+ private lazy val underlyingRange: NumericRange[T] = self
override def foreach[U](f: A => U) { underlyingRange foreach (x => f(fm(x))) }
override def isEmpty = underlyingRange.isEmpty
override def apply(idx: Int): A = fm(underlyingRange(idx))
@@ -173,20 +161,21 @@ extends IndexedSeq[T]
}
}
- // The contains situation makes for some interesting code.
- // I am not aware of any way to avoid a cast somewhere, because
- // contains must take an Any.
+ // a well-typed contains method.
+ def containsTyped(x: T): Boolean = {
+ def divides(d: T, by: T) = equiv(d % by, zero)
+
+ limitTest(x) || (
+ if (step > zero)
+ (start <= x) && (x < end) && divides(x - start, step)
+ else
+ (start >= x) && (x > end) && divides(start - x, step)
+ )
+ }
+
override def contains(x: Any): Boolean =
- try {
- // if we don't verify that x == typedX, then a range
- // of e.g. Longs will appear to contain an Int because
- // the cast will perform the conversion. (As of this writing
- // it is anticipated that in scala 2.8, 5L != 5 although
- // this is not yet implemented.)
- val typedX = x.asInstanceOf[T]
- containsTyped(typedX) && (x == typedX)
- }
- catch { case _: ClassCastException => super.contains(x) }
+ try containsTyped(x.asInstanceOf[T])
+ catch { case _: ClassCastException => false }
override lazy val hashCode = super.hashCode()
override def equals(other: Any) = other match {
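The simplified `contains` now just casts and delegates to `containsTyped`, whose membership rule is: `x` is in the range iff it lies between `start` and `end` (respecting the step's direction) and `x - start` is a multiple of `step`. A generic sketch of that rule over `Integral` (illustrative, exclusive-end only; the library version additionally handles the inclusive limit via `limitTest`):

```scala
// Membership test for an exclusive numeric range, mirroring the
// divisibility logic of containsTyped.
def rangeContains[T](start: T, end: T, step: T, x: T)(implicit num: Integral[T]): Boolean = {
  import num._  // brings in %, -, zero, equiv and the comparison ops
  def divides(d: T, by: T) = equiv(d % by, zero)
  if (step > zero) start <= x && x < end && divides(x - start, step)
  else             start >= x && x > end && divides(start - x, step)
}
```

The commit message (r22167) explains why the old `x == typedX` double-check could go: it guarded against the abandoned plan of making `5L != 5`.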
diff --git a/src/library/scala/collection/immutable/Stream.scala b/src/library/scala/collection/immutable/Stream.scala
index 7e363a7e96..5475c59809 100644
--- a/src/library/scala/collection/immutable/Stream.scala
+++ b/src/library/scala/collection/immutable/Stream.scala
@@ -15,6 +15,8 @@ import generic._
import mutable.{Builder, StringBuilder, LazyBuilder, ListBuffer}
import scala.annotation.tailrec
+
+
/** The class `Stream` implements lazy lists where elements
* are only evaluated when they are needed. Here is an example:
*
@@ -201,11 +203,14 @@ self =>
* @param p the predicate used to filter the stream.
* @return the elements of this stream satisfying <code>p</code>.
*/
- override final def filter(p: A => Boolean): Stream[A] = {
+ override def filter(p: A => Boolean): Stream[A] = {
// optimization: drop leading prefix of elems for which f returns false
- var rest = this dropWhile (!p(_))
- if (rest.isEmpty) Stream.Empty
- else new Stream.Cons(rest.head, rest.tail filter p)
+ // var rest = this dropWhile (!p(_)) - forget DRY principle - GC can't collect otherwise
+ var rest = this
+ while (!rest.isEmpty && !p(rest.head)) rest = rest.tail
+ // private utility func to avoid `this` on stack (would be needed for the lazy arg)
+ if (rest.nonEmpty) Stream.filteredTail(rest, p)
+ else Stream.Empty
}
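The key to this hunk: the lazy `rest.tail filter p` thunk must not be built inside a Stream instance method, because the thunk then closes over `this` and pins the stream's head in memory for the whole traversal. Moving construction to the companion object means the closure captures only `rest`. A schematic re-creation of the pattern (our own names, not the library source):

```scala
// Why the helper lives on a standalone object: a tail thunk built in an
// instance method would capture `this`; here it captures only `rest`.
object LazyFilter {
  def filteredTail[A](rest: Stream[A], p: A => Boolean): Stream[A] =
    Stream.cons(rest.head, filtered(rest.tail, p))

  def filtered[A](s: Stream[A], p: A => Boolean): Stream[A] = {
    // drop the non-matching prefix with an explicit loop (not dropWhile),
    // so no extra reference to the original stream survives
    var rest = s
    while (rest.nonEmpty && !p(rest.head)) rest = rest.tail
    if (rest.nonEmpty) filteredTail(rest, p) else Stream.empty
  }
}
```

Because the tail stays lazy, this works on infinite streams: `LazyFilter.filtered(Stream.from(1), _ % 2 == 0)` yields 2, 4, 6, ... on demand.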
override final def withFilter(p: A => Boolean): StreamWithFilter = new StreamWithFilter(p)
@@ -213,6 +218,7 @@ self =>
/** A lazier implementation of WithFilter than TraversableLike's.
*/
final class StreamWithFilter(p: A => Boolean) extends WithFilter(p) {
+
override def map[B, That](f: A => B)(implicit bf: CanBuildFrom[Stream[A], B, That]): That = {
def tailMap = asStream[B](tail withFilter p map f)
asThat[That](
@@ -344,6 +350,8 @@ self =>
if (n <= 0 || isEmpty) Stream.Empty
else new Stream.Cons(head, if (n == 1) Stream.empty else tail take (n-1))
+ override def splitAt(n: Int): (Stream[A], Stream[A]) = (take(n), drop(n))
+
/** A substream starting at index `from`
* and extending up to (but not including) index `until`.
*
@@ -452,9 +460,17 @@ self =>
flatten1(asTraversable(head))
}
+ override def view = new StreamView[A, Stream[A]] {
+ protected lazy val underlying = self.repr
+ override def iterator = self.iterator
+ override def length = self.length
+ override def apply(idx: Int) = self.apply(idx)
+ }
+
/** Defines the prefix of this object's <code>toString</code> representation as ``Stream''.
*/
override def stringPrefix = "Stream"
+
}
/**
@@ -601,8 +617,8 @@ object Stream extends SeqFactory[Stream] {
if (n <= 0) Empty else new Cons(elem, fill(n-1)(elem))
override def tabulate[A](n: Int)(f: Int => A): Stream[A] = {
- def loop(i: Int) =
- if (i >= n) Empty else new Cons(f(i), tabulate(i+1)(f))
+ def loop(i: Int): Stream[A] =
+ if (i >= n) Empty else new Cons(f(i), loop(i+1))
loop(0)
}
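The `tabulate` bug fixed above: the old inner `loop` recursed into `tabulate(i+1)(f)`, which restarts at index 0 with a shorter length, producing the wrong elements; the fix recurses through `loop` itself, which therefore needs an explicit result type (recursive methods require one in Scala). A self-contained sketch of the corrected version:

```scala
// Corrected lazy tabulate: recurse through the inner loop,
// not through tabulate, so indices advance monotonically.
def tabulateStream[A](n: Int)(f: Int => A): Stream[A] = {
  def loop(i: Int): Stream[A] =
    if (i >= n) Stream.empty else Stream.cons(f(i), loop(i + 1))
  loop(0)
}
```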
@@ -610,6 +626,10 @@ object Stream extends SeqFactory[Stream] {
if (if (step < 0) start <= end else end <= start) Empty
else new Cons(start, range(start + step, end, step))
+ private[immutable] def filteredTail[A](stream: Stream[A], p: A => Boolean) = {
+ new Stream.Cons(stream.head, stream.tail filter p)
+ }
+
/** A stream containing all elements of a given iterator, in the order they are produced.
* @param it The iterator producing the stream's elements
*/
diff --git a/src/library/scala/collection/immutable/StreamView.scala b/src/library/scala/collection/immutable/StreamView.scala
new file mode 100644
index 0000000000..9a7da3be89
--- /dev/null
+++ b/src/library/scala/collection/immutable/StreamView.scala
@@ -0,0 +1,12 @@
+package scala.collection
+package immutable
+
+
+
+import scala.collection.generic.CanBuildFrom
+
+
+
+
+
+trait StreamView[+A, +Coll] extends StreamViewLike[A, Coll, StreamView[A, Coll]]
diff --git a/src/library/scala/collection/immutable/StreamViewLike.scala b/src/library/scala/collection/immutable/StreamViewLike.scala
new file mode 100644
index 0000000000..0b76559a0b
--- /dev/null
+++ b/src/library/scala/collection/immutable/StreamViewLike.scala
@@ -0,0 +1,76 @@
+package scala.collection
+package immutable
+
+
+
+import scala.collection.generic.CanBuildFrom
+
+
+
+
+
+trait StreamViewLike[+A,
+ +Coll,
+ +This <: StreamView[A, Coll] with StreamViewLike[A, Coll, This]]
+extends SeqView[A, Coll]
+ with SeqViewLike[A, Coll, This]
+{ self =>
+
+ override def force[B >: A, That](implicit bf: CanBuildFrom[Coll, B, That]) = {
+ this.iterator.toStream.asInstanceOf[That]
+ }
+
+ trait Transformed[+B] extends StreamView[B, Coll] with super.Transformed[B]
+
+ trait Forced[B] extends Transformed[B] with super.Forced[B]
+
+ trait Sliced extends Transformed[A] with super.Sliced
+
+ trait Mapped[B] extends Transformed[B] with super.Mapped[B]
+
+ trait FlatMapped[B] extends Transformed[B] with super.FlatMapped[B]
+
+ trait Appended[B >: A] extends Transformed[B] with super.Appended[B]
+
+ trait Filtered extends Transformed[A] with super.Filtered
+
+ trait TakenWhile extends Transformed[A] with super.TakenWhile
+
+ trait DroppedWhile extends Transformed[A] with super.DroppedWhile
+
+ trait Zipped[B] extends Transformed[(A, B)] with super.Zipped[B]
+
+ trait ZippedAll[A1 >: A, B] extends Transformed[(A1, B)] with super.ZippedAll[A1, B]
+
+ trait Reversed extends Transformed[A] with super.Reversed
+
+ trait Patched[B >: A] extends Transformed[B] with super.Patched[B]
+
+ trait Prepended[B >: A] extends Transformed[B] with super.Prepended[B]
+
+ /** boilerplate */
+ protected override def newForced[B](xs: => collection.Seq[B]): Transformed[B] = new Forced[B] { val forced = xs }
+ protected override def newAppended[B >: A](that: collection.Traversable[B]): Transformed[B] = new Appended[B] { val rest = that }
+ protected override def newMapped[B](f: A => B): Transformed[B] = new Mapped[B] { val mapping = f }
+ protected override def newFlatMapped[B](f: A => collection.Traversable[B]): Transformed[B] = new FlatMapped[B] { val mapping = f }
+ protected override def newFiltered(p: A => Boolean): Transformed[A] = new Filtered { val pred = p }
+ protected override def newSliced(_from: Int, _until: Int): Transformed[A] = new Sliced { val from = _from; val until = _until }
+ protected override def newDroppedWhile(p: A => Boolean): Transformed[A] = new DroppedWhile { val pred = p }
+ protected override def newTakenWhile(p: A => Boolean): Transformed[A] = new TakenWhile { val pred = p }
+ protected override def newZipped[B](that: collection.Iterable[B]): Transformed[(A, B)] = new Zipped[B] { val other = that }
+ protected override def newZippedAll[A1 >: A, B](that: collection.Iterable[B], _thisElem: A1, _thatElem: B): Transformed[(A1, B)] = {
+ new ZippedAll[A1, B] { val other = that; val thisElem = _thisElem; val thatElem = _thatElem }
+ }
+ protected override def newReversed: Transformed[A] = new Reversed { }
+ protected override def newPatched[B >: A](_from: Int, _patch: collection.Seq[B], _replaced: Int): Transformed[B] = {
+ new Patched[B] { val from = _from; val patch = _patch; val replaced = _replaced }
+ }
+ protected override def newPrepended[B >: A](elem: B): Transformed[B] = new Prepended[B] { protected[this] val fst = elem }
+
+}
+
+
+
+
+
+
diff --git a/src/library/scala/collection/mutable/ArrayOps.scala b/src/library/scala/collection/mutable/ArrayOps.scala
index d7072c0661..00e8697b53 100644
--- a/src/library/scala/collection/mutable/ArrayOps.scala
+++ b/src/library/scala/collection/mutable/ArrayOps.scala
@@ -10,6 +10,7 @@
package scala.collection
package mutable
+import compat.Platform.arraycopy
import scala.reflect.ClassManifest
@@ -38,6 +39,13 @@ abstract class ArrayOps[T] extends ArrayLike[T, Array[T]] {
ClassManifest.fromClass(
repr.getClass.getComponentType.getComponentType.asInstanceOf[Predef.Class[U]]))
+ override def copyToArray[U >: T](xs: Array[U], start: Int, len: Int) {
+ var l = len
+ if (repr.length < l) l = repr.length
+ if (xs.length - start < l) l = xs.length - start max 0
+ Array.copy(repr, 0, xs, start, l)
+ }
+
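The `copyToArray` override above clamps the requested length to both the source length and the room left in the destination from `start` (floored at zero), then delegates to `Array.copy` so the JVM's bulk `arraycopy` is used instead of an element-by-element loop. A sketch of the clamping rule as a free-standing helper (our own signature; it also returns the copied count for checking):

```scala
// Number of elements actually copied by the clamped copy:
// min(len, source length, space left in dest from `start`), floored at 0.
def clampedCopy[T](src: Array[T], dest: Array[T], start: Int, len: Int): Int = {
  var l = len
  if (src.length < l) l = src.length
  if (dest.length - start < l) l = (dest.length - start) max 0
  Array.copy(src, 0, dest, start, l)
  l
}
```

Note the original line `l = xs.length - start max 0` parses as `(xs.length - start) max 0`, since alphabetic operators like `max` bind more loosely than `-`.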
override def toArray[U >: T : ClassManifest]: Array[U] =
if (implicitly[ClassManifest[U]].erasure eq repr.getClass.getComponentType)
repr.asInstanceOf[Array[U]]
diff --git a/src/library/scala/math/BigInt.scala b/src/library/scala/math/BigInt.scala
index 79b377bc6c..a21057c400 100644
--- a/src/library/scala/math/BigInt.scala
+++ b/src/library/scala/math/BigInt.scala
@@ -339,9 +339,9 @@ class BigInt(val bigInteger: BigInteger) extends ScalaNumber with ScalaNumericCo
def floatValue = this.bigInteger.floatValue
/** Converts this BigInt to a <tt>double</tt>.
- * if this BigInt has too great a magnitude to represent as a float,
- * it will be converted to <code>Float.NEGATIVE_INFINITY</code> or
- * <code>Float.POSITIVE_INFINITY</code> as appropriate.
+ * if this BigInt has too great a magnitude to represent as a double,
+ * it will be converted to <code>Double.NEGATIVE_INFINITY</code> or
+ * <code>Double.POSITIVE_INFINITY</code> as appropriate.
*/
def doubleValue = this.bigInteger.doubleValue