It does make me wonder what all these folks really mean when they talk about "ethnography," though. I'm still learning what it means myself, and I probably won't have a good answer until I spend a few years in the field doing it (and of course, my definition would surely continue to shift and evolve past that, too). The inimitable Jean Lave, while liberal and accepting in so many ways, holds a very strict definition of ethnography (as do many ethnographers). Among other things, she says that one must have a long-term immersion in the culture under study (where long-term is on the order of years, not days). None of this "rapid ethnography" that seems to be popular in HCI, and none of this substituting the term "ethnographic" for "qualitative" willy-nilly. Long stretches of time are necessary for many reasons: it takes time to really understand all of the intricacies and different points of view within a community (and such an understanding pays respect both to the lives of one's subjects and to the research process), it takes time to realize and challenge the assumptions the researcher brings to the table, and it takes time to collect enough data to start building hypotheses from the ground up, based on observations rather than preconceived notions of what might be interesting. And there are many more reasons, too.
There's another factor Jean Lave raises about realizing and challenging our implicit assumptions, the second point above. Ethnographers have traditionally required that the culture under study be sufficiently different from the anthropologist's own, because otherwise important cultural influences are as invisible as water is to a fish (or as the air we breathe is to us, I suppose). While treating this as a hard-and-fast rule has been questioned and stretched to an extent, most still recognize that cultural familiarity does breed many assumptions and unspoken understandings.
Speaking of which, this is one of the things that makes me most uncomfortable about quantitative studies. They can be immensely powerful, summarizing more data and investigating more users than a qualitative study ever could, but there's still a degree of interpretation that often goes undiscussed: what is interesting to focus on, what kinds of data are collected, what kinds of hypotheses are made and what assumptions are built into them. There aren't as many opportunities to "test" assumptions "in the field" when one is doing quantitative research, and it's easy to miss what's really important, or to find oneself at a loss when challenged with questions of why or how a community does what it does. Here it's unclear whether familiarity with a culture is more of an asset or a liability: it can lead to the same kinds of assumptions, but it can also give you insights that you couldn't get from the data alone. It's a drawback of quantitative research generally, I guess. Just one of the many reasons I'm trying to figure out how to walk the line between the two ...
Another thing that intrigues me about the talk this morning is how the speaker integrates design into the research process. It seems that many social scientists, even those researching technological artifacts in various ways, don't think directly about design ... even though some fora where they present their work expect it, as Paul Dourish said so well at the recent CHI conference. Others are more adept at design and system-building, but their social analyses seem, at the very least, to be lacking from the point of view of social science communities. Here, though, is someone who seems very adept in both spaces, and that's impressive to me. I'd love to get the chance to work with this person (and also folks like Genevieve Bell and Ken Anderson at Intel ... just while I'm naming names :~)) and find out how to do the blending effectively.