Janet Cardiff Sound Art

JCB_25072011_080845_9970

Just a quick note on some material in this hard-to-find catalogue résumé of Janet Cardiff's work.

It’s called Janet Cardiff: A Survey of Works, with George Bures Miller

Cardiff is well-known for her early-days “sound walks” where participants were given a Walkman or similar device to listen to as they walked about. Stories were told or experiences recounted in the audio track. The idea is simple, but from what I understand (never had the pleasure..) it was the story that made the experience engaging.

I first came across Cardiff’s work while doing sort of informal background research for the PDPal project where we were trying to understand interaction in the wild — away from desks and keyboard and all that.

What I find curious about her work is the way it augments reality before people even really thought about all this augmented reality stuff — but, it does not fetishize little tiny screens and orientation sensing and GPS and all that. It uses our earballs rather than our eyeballs — and somehow that makes it all much less fiddly. Although — if you look carefully at the bottom image you’ll see an image from a project in which one does use a screen — from a small DV camera which is playing a movie for you as you go along.

Janet Cardiff

Parenthetically, I think Cardiff had one of the best augmented reality projects with her telescopes. I’ve only seen this as documentation, when I saw Cardiff talk in Berlin at Transmediale 08. There should be more documentation of this somewhere, but the effect was to look through the telescope and see a scene in a back alley that was the back alley — only with a suspicious set of activities being committed — perhaps a crime. The illusion was in the registration, but the engagement was in the sequence of events one saw — effectively, the story. So much augmented reality augments nothing except coupons and crap like that. There is no compelling story in much augmented reality, but I don’t follow it closely so maybe things have changed.

JCB_25072011_080858_9971
JCB_25072011_080945_9972


Quiet But Not Quiescent

Judge not the less yammer-y state of the studio blog to indicate that there is nothing worth yammering about. It’s just that the clang of steel caressing code has been going on, and that in great measure, too. Some of you may have glimpsed and grinned at the fantastic electronified edition of the paper Drift Deck that we developed a couple of years ago. That’s right. We’ve added *batteries to the Drift Deck and it’s fallen into the *app well..it’s an app, which is fantastic because it means the last remaining physical card editions can become properly *artisanal and the electronic battery editions can spread the sensibility of the Drift Deck concept to the rest of the world.

Release is imminent. Prepare ye iPhones. Hop expectantly from foot-to-foot. More news in a short while, including linkages to downloadables. In the meantime, check out the new Drift Deck webified “page” and the fantastic roster of hammererers that batteryified the ‘deck.

..And then — onto the next thing here. It’ll be quiet a little, but good things are baking in the kiln, rest assured.

*Willow next. The superlative friendregator for the discerning social being.

The Week Ending 291210

2=8.41 1=11.40

Rules, instructions, parameters? Embedded inscriptions of some nature, found on a wall in Sayulita Mexico.

Well, maybe weeknotes are from the *week ending* but posted at the *week commencing*. One advantage of being one’s own blog boss, I suppose.

It was a decidedly *quick* week for some reason — perhaps because the Laboratory’s brother was visiting these last couple, and action, thinking and events seem to accelerate time. There were plenty of discussions of stories and filmmaking, which ties nicely into what *must* happen this month: the re-making of several short (30 seconds or so) visual design fiction stories meant to communicate some of our principles of Trust as embodied in some props/prototypes. This is proving quite creatively engaging and challenging.

There was a pleasant, slaloming conversation with the curious and effervescent Natalie, discussing Latourian design sensibilities and the ways that debates and conversations embed themselves in artefacts. It was lovely to have this chat, if only to begin trying out the various *props* that we’ve been making that are themselves exemplars of these arguments/theories/perspectives. The question remains — what is new here, as an argument? It was encouraging to hear Natalie’s excitement and the genealogy of this sort of thinking, reaching back to her canonical Live Wire and Rich Gold’s Evocative Knowledge Objects (to which the Theory Object owes everything).

This decanted into thoughts on a Latour essay presently at desk side.

The third connotation of the word design that seems to me so significant is that when analyzing the design of some artefact the task is unquestionably about meaning — be it symbolic, commercial, or otherwise. Design lends itself to interpretation; it is made to be interpreted in the language of signs. In design, there is always, as the French say, un dessein, or in Italian, disegno. To be sure, in its weakest form design added only superficial meaning to what was brute matter and efficiency. But as it infiltrated into more and more levels of the objects, it carried with it a new attention to meaning.

[[A Cautious Prometheus? A Few Steps Toward a Philosophy of Design (with Special Attention to Peter Sloterdijk), Bruno Latour]]

[[And special sideways inspiration from Karen, whose present reading/thinking I seem to be accidentally following alongside.]]

And then, I was thinking about Trust in this context and this precise basis for the process of *embedding* the sensibilities and sensitivities of Trust as a design practice. More as this idea develops.

There was a round of planning for future projects at the Nokia Design Strategic Projects studio, which meant thinking about what from Trust moves forward, in which ways, and by what means. Similarly, we are beginning to share Trust. And wondering — with whom, and to what ends? I am intrigued by this — how do you circulate ideas, and with what goals, so that you know — in a more actionable way — how the ideas materialize and create other goals, especially within such a byzantine organization? This, I think, is one of the larger 2010 *professional* goals (seeing as I have not really captured what those might be yet — bit tardy on that objective — I like to have New Year’s goals rather than New Year’s resolutions): how to communicate ideas such as these, do so without PowerPoint, and do so in such a way that you snap people out of a corporate stupor, or whatever it is — and do more than just scrape a bit of paint on the battleship. Rather, help set a different course heading.

The Week Ending 220110

Sunday January 24 14:01

What one finds house hunting in Los Angeles and coming across one owned by a Hollywood set designer. Also looking at the same moment, a demure, polite and inquisitive actress vaguely recognized and thence confirmed to be the nitty Shannon from season one of Lost.

Diligent weeknotes are already eluding me. Perhaps because it was a short week last week and I wasn’t in the studio until Thursday. Nevertheless — mostly a couple of days of dusting off the desk and considering what remained to finish from the previous year and continue on into the new one.

Project Trust achieved its milestone late last year, and the last couple of days last week were spent assessing its 2010 tributaries — where and with whom does it get shared? How to distil what has been learned, both in practical terms as well as in the very intriguing, curious *meta* terms, such as — what did we learn about how to design in such a way as to achieve unexpected, new, perhaps innovative things? What about the friction of design that hones and reshapes and burnishes a nascent idea into a new, curious, future form that moves away from the hum-drum expected outcomes? What about the style of communication, which has moved away from PowerPoint / Keynote into visual stories? What is that, and how can it be informally formalized as a new way of sharing ideas that — for the time being, while this style is still new — shocks, excites and awes people into becoming fervent allies and helps turn an idea into its deserved material form?

So. Decisions made, for the most part, about what prototypes find their way downstream, or up-the-ladder, or to new lands. Movies blocked and storyboarded, or at least decided upon. That was those two days last week.

Slow Down

Friday January 15, 21.24.48

Friday January 15, 21.25.45

Friday January 15, 21.27.17

It’s not often we’re found in print, but this happened when the magazine Good did its “Slow Issue”. Jennifer Leonard chatted with us one morning about our perspectives on the slow movement because of our work on the Slow Messenger device and on-going collaborations with slowLab and Carolyn Strauss. There’s mention of the device and a brief interview with folks like Bruce Sterling, Esther Dyson and Jamais Cascio in the magazine and online.

The Week Ending 080110

Sunday September 20, 12.53.26

Markings for repair or warnings to mitigate accidents? Seen in Seoul, South Korea.

Whilst technically still on holiday, there were some things done as usual and *holiday* is never entirely just not doing nuthin’.

There was a quick visit to the studio to begin to finish the second of two commissioned Trust devices, which is looking simultaneously quite insightful and lovely. I hope some day that this becomes a lever to torque the rudder if even ever so slightly.

Jennifer Leonard’s interviews in Good Magazine’s Slow Issue (*Perspectives on a smarter, better, and slower future*) with Esther Dyson, Jamais Cascio, Bruce Sterling, John Maeda, Alexander Rose and myself appeared online. The topic of the short discussions? “We asked some of the world’s most prominent futurists to explain why slowness might be as important to the future as speed.”

And, prompted by Rhys’ clever insights into a richer, smarter, less ROI-driven vector into thinking about this whole, you know..augmented reality mishegoss, I’ve been reading a fascinating history of linear perspective that has been helping guide more meaningful thinking. (I have yet to see anything that leaps much further beyond flags showing where something is by holding up a device in front of my face, which just seems momentarily cool and ultimately not particularly consonant with all the hoopleheaded hoopla.)

I’ve started The Renaissance Rediscovery of Linear Perspective, which has a number of curious insights right off the bat, particularly ones that remind us that linear perspective is only a possibility and not necessarily something to be thought of as “realistic” from a variety of perspectives. In fact, it merely makes renderings that remove experience and abstract points-of-view, something that I recently learned from Latour’s Visualisation and Cognition (which, not surprisingly, led me to this Edgerton book via a reference and footnote).

Configuration A - Binocular Form Factor

A Laboratory experiment from 2006 — *Viewmaster of the Future* — using a binocular-style form factor. ((The lenses are removed in this photo.))

And, the follow-on, which I haven’t started yet is the enticingly titled The Mirror, the Window, and the Telescope: How Renaissance Linear Perspective Changed Our Vision of the Universe, which immediately caught my eye as I am drawn more to the history, imagery, rituals and *user experience* dimensions of telescopes and binoculars as affordances for, bleech..*augmented reality* than this stupid hold-a-screen-up-to-my-face crap. ((cf. this stuff below — the screen-up-to-my-face configuration — never felt as good as the second iteration of this *Viewmaster of the Future* experiments we did a few years ago.))


Generative Urban Design

LAIsland_001

LAMap_014


These images are from a series of generative, algorithmic sketches to describe what Los Angeles might look like as an “augmented reality.” Specifically, one view of the city from my point of view, where the topography and the built environment’s height and density are a function of my presence. An ego city, or something.

This is more an idea that has been stuck in my head and needed some expression. I am not at all sure what one does with this or how one uses it in any instrumental way except as a proper augmentation of the one canonical reality. A bit of a Kevin Lynch (Good City Form which I haven’t finished but am enjoying and, of course, The Image of the City) style map of presence, sketched from accumulated presence data rather than specifically what I imagine or how my brain conceives of urban space.

These are simple, early sketches to see how home made cartography might create density maps that reveal some sort of cartographic indication of where you have been, leaving blank or perhaps more obvious the places you have not been. Or a GPS that shows a fog-of-war map, or constructs routes for you based on a principle of exploration — routing you through areas that you have yet to see or explore.

To be continued, as always. Just curious.

Why do I blog this? Because I am anxious to find alternative perspectives of the city, especially ones that are dynamic and produced from closer to the ground-up, rather than from the top-down. Using occupancy as a measure, or as the algorithmic seasoning, seems like a Lynchian natural first step. Based on the amount of time spent in particular areas, my own personal maps should reflect this somehow, either by fogging out all the rest of the space, drawing the rest of the space as blank or, as in these sketches, altering the terrain height and the built environment’s density and building heights, etc. (Of course, these are not actual buildings from Los Angeles — it is all a thought, a sketch of these ideas.) These are the things I have been thinking about, along with other kinds of algorithms and/or mechanisms to materialize these ideas, such as the Drift Deck, the Apparatus, Personal Digital Pals, etc.

Also, I thought I lost these sketches after complete, well-founded frustration with the absolute most crappiest piece of over-priced software I have ever come across in the whole world.

William H. Whyte Revisited: An Experiment With An Apparatus for Capturing Other Points of View

Times Square Urban Living Room from Julian Bleecker. More Apparatus Videos.

[[Update: The Apparatus was exhibited at the HABITAR show at LABoral in Gijón Spain this summer 2010.]]

A couple of months ago a colleague, Jan Chipchase, floated by my desk and handed me a book of his called “The Social Life of Small Urban Spaces” by William H. Whyte. I had no idea who this Whyte character was, and I could only guess what it was about. Just by the title, I figured this would lead me down another rabbit hole of exploration and experimentation.

As I flipped through the pages, looking at the images of urban observations of New York City from the 1970s, I was enthralled by the technique as well as the substance of the material. Whyte and his team were capturing the intriguing, sometimes curious ways in which people adapt small corners of urban space and their habits and practices and rituals. The pace and momentum of pedestrian movement is intriguing. Without assuming anything in particular, Whyte’s work was capturing movement in a seductive way — even small scale jolts and shifts and gestures. Someone moving a chair just a small bit to indicate that he is not attempting to invade someone’s microlocal private space. You see the “fast-movers” bobbing and weaving quickly around a phalanx of slow moving tourists, window shoppers or a more elderly pedestrian.

Wonderful, intriguing stuff. Sold. Hooked. What’s the brief? Oh, what would I do? Follow footsteps and curiosities, I guess. I was curious — how can the momentum and pace and speed (or lack thereof) of the urban flows be captured, highlighted, brought into focus and revealed in such a way as to visually describe time, movement, pace, scales of speed and degrees of slowness?

There is lots to say about Whyte, I am sure. I have only begun to scratch the surface of this well-known urban sociologist, explorer, scout, observer. But, for the purposes here, what happened as a result of this brief conversation with Jan was something that spread through the studio — a bout of curiosity that led to another, other project. It started simply by wondering if the observational studies that Whyte had done, both in this book and in other projects, could be done today. And, if so — what might they observe? What might be the questions? By what principles and assumptions would small urban spaces be explored?

A copy of the films Whyte had made was secured in short order. Simple observations from ground level as well as from carefully chosen vantage points up high, above the ground. This intrigued me. There had been a project in the studio this time last year with things placed high for observational purposes (high chairs, periscopes, etc.) and it was filed away in the “lost projects” binder, so this seemed perhaps a way to revive that thinking. Over the course of a week, I made four trips to Home Depot, Simon jigged a prototype bracket on the CNC machine, and I had a retractable 36 foot pole that I imagined I was going to hang a heavy DSLR off of — it scared the bejeezus out of me and required two people to safely raise up. Too high, too floppy.

Another pole — 24 feet. Daunting but serviceable. It retracts to 8 feet, which is still quite high, but the range made it worth the embarrassment. After a brief bang around the reputation and suggestion networks, a wide field of view camera was identified and two ordered. Two cameras, secured to the pole produced a fair resolution, very wide field of view for displaced observations from a peculiar point of view. Good enough.

Penn Station Still Observation from Julian Bleecker on Vimeo.

Observation apparatus deployed at 7th Avenue main entrance to Pennsylvania Station, NYC, capturing ingress & egress flows, pedestrians waiting, deciding, waking up in the morning upon hitting the sidewalk, &c. The slow-scan mode highlights things which are not moving and therefore often discounted as to their import such as, for instance, the two peculiar characters to the far left who scarcely move (and were still there at the end of the day, around 7pm!), defensible space obstacles in the form of potted plants, people who wait for things, time to pass, people or taxi cabs, &c.

A notion interpreted and brought into focus by Rhys Newman.

Friday June 19, 16.17.17

15th Street and 5th Avenue, New York City.

Using some generative algorithms to show neutral zones of flow and highlighting areas of relatively stable inactivity. Somewhat mitigated by the windiness of the day which caused the cameras to move quite a bit.

Whyte was intrigued by the movement, flows, behaviors, but also emphasized engaged observation — pen and paper, not just measurements and statistics. He was observing and analyzing both statistically — flows of people per time period over various widths of sidewalk, for example — as well as capturing those things that one misses in abstracted data sets. In the film, his avuncular tone draws our attention to small, curious practices. Things like someone moving a chair in a public open space barely a few feet from where it was, so as to indicate to a nearby fellow New Yorker that they were not intending to impose upon their public-privacy.

There was something about these sorts of couplings between the analytic data — numbers and so forth — and the observed, seen and demonstrated activities of people. Observed practices crafted into a kind of story about this subject — the social life of small urban spaces. Finding ways to observe and perhaps produce useful insights and design inspirations based on the observations seems a reasonable goal. There is only so much you can do with the books of abstracted data squirreled away some place before you have to go out in the world. What I was most interested in exploring was somewhere “lower” than the high-level observations which produce intriguing visualizations but are many steps removed from the everyday, quotidian practices. Some empirical, rough-around-the-edges, observational data ethnography. A close cousin of the truly fascinating data visualizations we have grown to love. Perhaps close to Fabien’s notion of citizen sensors and citizen cartography.

We got plenty of guff with the Apparatus when we took it on the new Highline Park. One rather abrupt park minder — sort of behaving like an airline stewardess on a really bad day — was not pleased with the pole at all and let us know it. I had to talk to someone back at the offices of the "Friends of The Highline" via a cellphone given to me by a guy who was like a human surveillance entity. The woman on the phone explained – after listening to my perhaps overly analytic and historic description of the project, Whyte, &c. – that they do not allow tripods or, "you know..long poles" in the park.

Errr ahhh…

It was all very weird, and very un-appealing and put a cloud on what is a playful project, I think, but — *shrug*.

It’s all to be figured out. Or not. Perhaps it’s just observation. Scraps and visual thinking. Some notes in video. Constructed objects and anticipation of going mobile in Seoul and Helsinki and Linz and London. &c. Or some kind of exploration to suggest alternative ways of seeing the world around us. That may be closer to the point, at least now.

The post-processing stages of the activity are mostly explorations of ways in which individuals or small groups of people in movement could become their own producers of representations of what they do, in an aesthetic sense. What other sorts of systems might people-flows evoke or be reminiscent of? Weather patterns? Displacement grids? Where is there stillness in the bustle? Can the city’s flows be slowed down to evoke new considerations and new perspectives of what happens in the small urban spaces?

People themselves are often seen to be controlled in a top-down fashion — less insidious than “the man”, perhaps, but consider the signature pedestrian operator: the “I want to cross” button at many busy intersections. It’s a point of contact with the city’s system of algorithmic, synchronized flows. But what about people as their own algorithms, by virtue of their occupancy of urban space? Not following specific top-down plans, but bottom-up actions and movements. Not augmented reality but productions of realities. The center of what happens, displaced from the operational command center that articulates how the flows will operate.

I love these moments that contravene the system-wide control grids, which you can see if you watch carefully the raw footage from 15th Street and 5th Avenue, where pedestrians spread themselves into the street, stretching the boundaries of the safety of the sidewalk in anticipation of the crossing. Or, perhaps something I love less but is still something to note: a bicyclist turning the corner against traffic, possibly into pedestrians who may be less inclined to look whence traffic should not be coming.

We push buttons to control the algorithms of the city, as in the buttons to control signals and so forth, or roll our cars over induction loops — these are parameters to the algorithms of top-down control over urban flows. Suppose we interceded more directly, or suppose the geometry of the city were represented this way, as a product of the non-codified “algorithm” of movements.

What sort of world would this be? What would it look like?

Highlighting only things that are moving in the Union Square Farmers’ Market.

A cartesian grid distorted by flows around the Union Square Farmers Market.

Wednesday June 17, 15.04.24

Wednesday June 17, 14.44.17

Help and thanks to Marcus Bleecker, Chris Woebken, Rhys Newman, Simon James, Jan Chipchase, Aaron Meyers, Noah Keating, Bella Chu, Duncan Burns, Andrew Gartrell, Nikolaj Bestle. And so on.

Videos live online and will accumulate over time. This is Times Square, NYC; the Highline in Chelsea, NYC; and a generative video done with Max/MSP Jitter.

LIS302DL. A 3 Axis Accelerometer

Tuesday July 14, 13.12.21

Ooooh. Those code jockeys in the Laboratory have been mucking about in the ol’ locker room, giving each other rat-tails, chucking firecrackers in the halls and having a good horsin’ around. Smoking cigarettes and drinking cheap booze. Everything. Stink bombs in the girl’s room. Whatever. It’s a regular Lord of the Flies fest on the electronics wing of the Near Future Laboratory. And, look what we found! Some pole dancin’ hardware porn! Step right up! Don’t crowd..

Wednesday July 15, 13.04.33

In reverent honor of my friends and chums who are holding forth with Sketching in Hardware 09 in London and for the solemn sadness I have for not being able to participate this year, I hereby drop some code and hardware science up on this piece of blog with a dozen or so lines of Arduinoness meant to articulate and instrumentalize the wonderful ST Micro LIS302DL 3 axis accelerometer, delivered here via the Sparkfun breakout board. Without further ado, but with plenty of firmware nakedness, here’s the sketch..*slug* this one’s for you, Sketchers..*sob*

#include <Wire.h>

// TWI (I2C) sketch to communicate with the LIS302DL accelerometer
// Using the Wire library (created by Nicholas Zambetti)
// http://wiring.org.co/reference/libraries/Wire/index.html
// On the Arduino board, Analog In 4 is SDA, Analog In 5 is SCL
// These correspond to pin 27 (PC4/ADC4/SDA) and pin 28 (PC5/ADC5/SCL) on the Atmega8 and Atmega168
// The Wire class handles the TWI transactions, abstracting the nitty-gritty to make
// prototyping easy.

// We've got two accelerometers connected. You configure the address of each one
// by some wiring
int address_1 = 0x1C; // SDO on the LIS302DL connected to GND makes it at I2C address 0x1C
int address_2 = 0x1D; // SDO/MISO on the LIS302DL connected to VCC makes it at I2C address 0x1D
void setup()
{
  // we'll use the serial port to spit out our data
  Serial.begin(9600);

  Wire.begin(); // join i2c bus (address optional for master)

  // Check that each device is where we expect it, and initialize it
  init_accelerometer(address_1);
  init_accelerometer(address_2);
}

// Read from the "WHO_AM_I" register of the LIS302DL at the given address.
// If the device responds with the expected signature (0x3B), configure its
// CTRL_REG1 register to initialize it. If it does not, spit out a useful
// message on the serial port. We'll get erroneous data from querying this
// address, though.
void init_accelerometer(int address)
{
  byte tmp;

  Wire.beginTransmission(address);
  Wire.send(0x0F); // WHO_AM_I (0Fh)
  Wire.endTransmission();
  Wire.requestFrom(address, 1);
  while(Wire.available()) {
    tmp = Wire.receive();
    if(tmp != 0x3B) {
      Serial.print("Problem! Can't find device at address "); Serial.println(address, HEX);
      delay(1000);
    } else {
      Wire.beginTransmission(address);
      Wire.send(0x20); // CTRL_REG1 (20h)
      Wire.send(B01000111); // Nominal data rate, device in active mode, +/- 2g scale, self-tests disabled, all axes enabled
      Wire.endTransmission();
    }
  }
}

void loop()
{
  Serial.print("1:"); read_acceleration(address_1);
  Serial.print("2:"); read_acceleration(address_2);
}

void read_acceleration(int address) {
  int x_val, y_val, z_val;

  // Read one byte from register 0x29, which is OUT_X
  Wire.beginTransmission(address);
  Wire.send(0x29);
  Wire.endTransmission();
  Wire.requestFrom(address, 1);
  while(Wire.available()) {
    x_val = Wire.receive();
  }

  // Read one byte from register 0x2B, which is OUT_Y
  Wire.beginTransmission(address);
  Wire.send(0x2B);
  Wire.endTransmission();
  Wire.requestFrom(address, 1);
  while(Wire.available()) {
    y_val = Wire.receive();
  }

  // Read one byte from register 0x2D, which is OUT_Z
  Wire.beginTransmission(address);
  Wire.send(0x2D);
  Wire.endTransmission();
  Wire.requestFrom(address, 1);
  while(Wire.available()) {
    z_val = Wire.receive();
  }

  // The registers hold 8-bit two's complement values. I want values that
  // run from {-X to 0} and {0 to +X}, so a little bit of math goes on here:
  // if the sign bit is set, map the raw value to its negative counterpart.
  if(bit_is_set(x_val, 7)) {
    x_val &= B01111111;
    x_val = 128 - x_val;
    x_val *= -1;
  }

  if(bit_is_set(y_val, 7)) {
    y_val &= B01111111;
    y_val = 128 - y_val;
    y_val *= -1;
  }

  if(bit_is_set(z_val, 7)) {
    z_val &= B01111111;
    z_val = 128 - z_val;
    z_val *= -1;
  }

  Serial.print(x_val); Serial.print(":"); Serial.print(y_val); Serial.print(":"); Serial.println(z_val);
}

Tuesday July 14, 13.09.59

Wednesday July 15, 14.37.44

What’s going on here? Well, straightforward silliness and hardware gafflin’. Two accelerometers on the I2C bus, just ’cause. It looks like this is as many as you can have on there, unless you do some shenanigans or create a second bus with some cleverness.

The hardware spec on the device (Which. You. Should. Read.) explains that, if we’re going to talk to these things using the I2C bus we need to wire CS to the high end of the logic rail, so VCC. This tells the device that we’ll be using I2C protocol. Then we need to address the device. If we connect the MISO (master in, slave out) pin to GND, then the address will be 0x1C. If we connect the MISO pin to VCC, then the address will be 0x1D. So, MISO, in the I2C configuration, controls the least significant bit of the address. This way, without further mucking about, we can have two accelerometers on the bus, which is probably one more than most situations demand, but just in case.

If I were to connect more than two, I would probably go ahead and use the three-wire protocol and have one microcontroller pin per accelerometer dedicated for chip-select (CS). Fortunately, this device supports three-wire protocols, or the SPI protocol.

Tuesday July 14, 13.17.11

The Arduino code example above does some simple preambling — initializing the two devices after making sure they are there. Then it just loops forever, reading accelerometer data from each of the three axes of each device, doing a little simple bitwise arithmetic to convert each raw reading into a signed value — from negative g (upside down, in most situations) through positive g (right side up, in most situations). The initialization stage sets the accelerometer range — that is, the max/min values it will read — to +/- 2g. (The device will support +/- 8g, according to the specifications.)

There are some cool additional features that I don’t play with, including some interrupts that can be triggered if the device falls suddenly, or if it is “clicked/tapped” or “double-clicked/double-tapped”, which is kinda cool, I guess. If you can come up with a non-gratuitous scenario. Which is probably harder than it sounds. But even in your gratuitous I-double-click-my-glass-of-Porto-to-signal-the-waiter-I-need-more-Porto scenario, the device will save you the hassle of implementing this sort of interaction semantics in firmware and get you back to finishing whatever you were doing in the first place.
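To see what the built-in interrupt spares you from, here is a hypothetical double-tap detector of the kind you would otherwise write in firmware. The 300 ms window is an assumed value, and the whole thing is illustrative rather than anything from the post:

```c
#include <stdint.h>

/* Hypothetical double-tap detector: feed it tap timestamps in
 * milliseconds; it reports a double-tap when two taps land within
 * the window. The window length is an assumption. */
#define DOUBLE_TAP_WINDOW_MS 300

typedef struct {
    uint32_t last_tap_ms;
    int have_last; /* nonzero if a first tap is pending */
} tap_detector;

static int feed_tap(tap_detector *d, uint32_t now_ms) {
    int is_double = d->have_last &&
                    (now_ms - d->last_tap_ms) <= DOUBLE_TAP_WINDOW_MS;
    d->last_tap_ms = now_ms;
    d->have_last = !is_double; /* a double-tap resets the pairing */
    return is_double;
}
```

Getting the timing windows and the reset behavior right is exactly the fiddly part the accelerometer’s interrupt engine handles for you.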

Why do I blog this? Notes on the integration of hardware to firmware to ideas. This time with a “new” accelerometer that has some pretty neat features. After this, we’ll be going to paper-pulp and line drawings for a bit, folks.

Companion Species Training Game

Wednesday July 15, 14.35.42

The new-to-me Nintendo DS “Personal Trainer Walking” (heck of a name..) alongside the Japanese-language game whose name I forget and cannot read.

I found out about this Nintendo DS game from Kevin, who found out about it from Russell. I literally just got it yesterday, but it’s pretty exciting to see. I can only imagine in my head how the play dynamics unfold, but I’ll be playing with it and will have some more thoughts before long.

So far I enjoy the “blind” design of the pedometer part of the concept — not too much display other than this blinking light, which changes color when you’ve reached your goal. Simple, direct, and not a nagging taunt from a fancy LCD that shows more than you need. You can focus on your activities, or on just being a normal human being, without poking and prodding at the device all the time, checking your status in detail, etc. When you’re in the world, be in the world, I say.

Wednesday July 15, 15.47.20

Wednesday July 15, 15.14.44

One aspect of the design is quite curious: there is an extra pedometer device for your dog! I mean, I get the idea. People walk their dogs, so this is perfect for you and your dog to get some training together. The language in the user’s manual / guidebook is very funny, and I’m not sure whether this is deliberate or perhaps the sensibility of a Japanese game design company. I know none of the facts, and that does not matter so much to me, but maybe it’s my sensitivity to things that fold different species together into what my advisor calls “transpecies” or “companion species” — species that need each other, or play and interact together in curious ways. (cf. The Companion Species Manifesto) Thus, I cracked up when I read these items in the guide:

Wednesday July 15, 15.12.07

The meter should only be used by a person or dog. It will not work properly with any other type of animal.

The meter should only be used on a dog when supervised by a person. It should be attached in a location where it is not at risk of being chewed or swallowed.

Great stuff. I’m looking forward to seeing how the DS experience works.

Downside: I’m pre-disappointed that walking is the only physical activity it seems to work with. I ride a bike and want this to count. And there are so many other sorts of physical things that won’t count, I assume.

*shrug*

Russell points out the simplicity of the synchronization ritual, which is fantastic. Point. Press. Watch your pedometer pebble appear from a pipe on the screen and become “alive” on your screen. If you’ve ever tried to synchronize ANYTHING you’ll laugh out loud, as I did. If you’ve ever designed ANYTHING that requires synchronization, take close note of the interaction ritual here. It’s fantastically playful and simple and sensical.

Some related topics: this perpetual Laboratory project, Flavonoid.