Subject: Re: URGENT: converting a bit-array to a number
From: Erik Naggum <erik@naggum.net>
Date: 07 Jun 2001 12:44:54 +0000
Newsgroups: comp.lang.lisp
Message-ID: <3200906694248226@naggum.net>

* "Hugo Veiga" <hasv@netcabo.pt>
> please, can anyone tell me how to convert a bit-array to a number without
> losing time processing it.

  This is impossible.  It will always take some time.  This is not C, where
  you can keep the underlying bit representation but change "types" by the
  way you look at them.

  However, knowing that there _is_ an underlying bit representation may
  make it far easier to do this than it would have been to go through it
  digit by digit, which, by the way, can be done rather nicely with this
  function:

(defun bit-array-integer-value (bit-array)
  "Returns the bits of the bit-array as an integer as the primary value.
The secondary value is the number of bits."
  (let ((place -1))
    (values (reduce #'+ bit-array :key (lambda (digit) (ash digit (incf place))))
	    (incf place))))

  Please note that (bit <bit-array> 0) is the _least_ significant bit.
  There are no signs in bit-arrays unless you impute one, and it is not a
  good idea to do so.  The value of a bit in slot n should be (expt 2 n),
  to make for the most mathematically sound representation.  Note that the
  way a bit array is printed thus produces a _reverse_ sequence of digits
  compared to the binary integer representation.
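  To make the bit order concrete, here is the digit-by-digit conversion
  applied to small sample values (a self-contained sketch; the function
  body repeats the one above):

```lisp
(defun bit-array-integer-value (bit-array)
  "Return the bits of BIT-ARRAY as an integer, and the number of bits."
  (let ((place -1))
    (values (reduce #'+ bit-array :key (lambda (digit) (ash digit (incf place))))
            (incf place))))

;; #*1011 has bit 0 = 1, bit 1 = 0, bit 2 = 1, bit 3 = 1,
;; so its value is 1 + 0*2 + 4 + 8 = 13, over 4 bits.
(bit-array-integer-value #*1011)        ; => 13, 4
;; Printed left to right, #*1101 is the _reverse_ of 13's binary 1101:
(bit-array-integer-value #*1101)        ; => 11, 4
```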

  It is no coincidence that both bignums and bit vectors are stored with
  the least significant digit (of their respective bases) first.  While
  big-endian representations are a lot easier to deal with because we write
  our numbers in big-endian digit sequences, hardware-wise little-endian
  has serious advantages.  If you want the reverse view, you will have to
  work a lot harder to get your integer representation.  (Note that the
  simplest way to convert from big-endian to little-endian bit order at the
  byte level is via a table of byte values if you do not have hardware
  support for it.)
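  The table-driven reversal could be sketched like this (the table and
  function names are illustrative, not from any library):

```lisp
(defvar *reversed-bytes*
  (coerce (loop for byte below 256
                collect (loop with reversed = 0
                              for bit below 8
                              do (setf reversed
                                       (logior (ash reversed 1)
                                               (ldb (byte 1 bit) byte)))
                              finally (return reversed)))
          '(vector (unsigned-byte 8)))
  "Table mapping each byte to the byte with its bit order reversed.")

(defun reverse-byte (byte)
  "Reverse the bit order of the 8-bit BYTE via table lookup."
  (aref *reversed-bytes* byte))

;; (reverse-byte #b10110000) => 13, i.e. #b00001101
```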

#+(and franz-inc (or allegro-v6.0 allegro-v6.1) little-endian)
(defun bit-array-integer-value (bit-array)
  (let* ((bits (length bit-array))
	 (bigits (ceiling bits 16))
	 (value (copy-seq bit-array)))
    ;; Turn it into a possibly denormalized bignum
    (setf (sys:memref value -16 -2 :unsigned-word) 18)
    (setf (sys:memref value -16  0 :unsigned-word) bigits)
    ;; Normalize it.  This better not be optimized away!
    (+ 0 value)))

  It should have been needless to say even in this litigious age, but this
  is for information only.  Use at your own risk.

> Is there any function to do that?  The thing is that I wish to make a
> board for a game with a bit-array to save space, but at the same time I
> wish to have a unique hash value for different boards, so the most
> natural way is to convert that bit-array to a number.  I searched all
> over the internet and found nothing about that function.

  You could use integers to represent the board directly.  Use logbitp
  instead of bit to access an individual bit.  Of course, it is no longer a
  sequence if that matters.
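  That could look something like this (board size and function names are
  illustrative):

```lisp
(defun board-bit (board square)
  "Return true if SQUARE is occupied on the integer BOARD."
  (logbitp square board))

(defun set-board-bit (board square)
  "Return a new BOARD with SQUARE set."
  (dpb 1 (byte 1 square) board))

;; An empty board is just 0; setting squares 0 and 10:
;; (set-board-bit (set-board-bit 0 0) 10) => 1025
;; Since the board _is_ an integer, it already works as a key in an
;; eql hash table -- no conversion step at all.
```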

#:Erik
-- 
  Travel is a meat thing.