
Archive of the category 'Data Mining'

Regressive and Reflexive Mashups in Sampling Culture, 2010 Revision, by Eduardo Navas

Download a high-resolution version of the diagram in PDF format

This text was originally published on June 25, 2007 in Vague Terrain Journal as a contribution to the issue titled Sample Culture. It was revised in November 2009 and subsequently published in May 2010 as a chapter in Stefan Sonvilla-Weiss (Ed.), Mashup Cultures (Springer Wien/New York, 2010), ISBN 978-3-7091-0095-0.

It is republished here with permission from the publisher, and I ask that it be cited appropriately.  This online publication differs from the print version in that it is missing the images that help illustrate the theory of Remix I propose.  I encourage readers to consider looking at the actual publication, as it offers an important collection of texts on mashups.

I would like to thank Greg J. Smith for giving me the opportunity to publish my initial ideas in Vague Terrain, and Stefan Sonvilla-Weiss for inviting me to revise them as a contribution to his book publication.

This version brings together much of my previous writing.  Those who have read texts such as The Bond of Repetition and Representation, as well as Turbulence: Remixes and Bonus Beats, will find that many of my definitions and theories of Remix are repeated in this text.  I found this necessary to make sense of a fourth term which I introduce: the Regenerative Remix.  Those who have read the previous version of this text may wish to skip the pre-existing parts and go directly to the section titled “The Regenerative Remix.”  However, all sections have been revised for clarity, so I encourage readers to at least browse through the previously written material.

An important change has been made to this text.  In the original version I argued that Reflexive Mashups were not remixes.  In 2007 I did not know what Reflexive Mashups could be if they were not remixes in the traditional sense, but after further consideration and rewriting, I developed the concept of the Regenerative Remix.  To learn more about this change in my definition of Remix as a form of discourse, I invite readers to consider my revised argument.  I also introduce a chart (above) that helps explain how Remix moves across culture, and an entirely new conclusion that clarifies my earlier position on software mashups.

A note on formatting: The text below is set in plain-text form, which means that italics and other conventions found in print publications are missing.  If you would like to read a print-ready version, please download the PDF file.

———-

Introduction

During the first decade of the twenty-first century, sampling is practiced in new media culture whenever software users, from creative industry professionals to average consumers, apply cut/copy & paste in diverse software applications. For professionals this could mean 3-D modeling software like Maya (used to develop animations in films like Spider-Man or The Lord of the Rings); [1] for the average person it could mean Microsoft Word, often used to write texts like this one. Cut/copy & paste, which is in essence a common form of sampling, is a vital new media feature in the development of Remix. In Web 2.0 applications cut/copy & paste is a necessary element in developing mashups; yet the cultural model of the mashup is not limited to software, but spans across media.

Mashups have their roots in sampling principles that became apparent around the seventies, with the growing popularity of music remixes in disco and hip hop culture; yet even though mashups are founded on principles initially explored in music, they are not straightforward remixes if we think of remixes as allegories. This is important to entertain because, at first, Remix appears to extend the repetition of content and form in media in terms of mass escapism. The argument in this paper, however, is that when mashups move beyond basic remix principles, a constructive rupture develops that shows possibilities for new forms of cultural production which question standard commercial practice.

(more…)

After Media (Hot and Cold), by Eduardo Navas

Image capture, July 11, 2009, http://hulu.com

The following text was originally published in August 2009 as part of Drain’s Cold issue.  Drain is a refereed online journal published twice a year.  The text is republished in full on Remix Theory with permission.  Drain’s copyright agreement allows 25% of the essay to be reblogged or reposted on other sites with proper citation and a link to the journal at http://www.drainmag.com/.  I ask that the online community respect this agreement.

In 1964 Marshall McLuhan published his essay “Media Hot and Cold” in Understanding Media, one of his most influential books.[1] The essay considers the concepts of hot and cold as metaphors to define how people before and during the sixties related to the ongoing development of media, not only in Canada and the United States but also throughout the world.[2] Since the sixties, the terms hot and cold have become constant points of reference in media studies. However, these principles, as defined by McLuhan, have changed since he first introduced them. What follows is a reflection on such changes during the development of media in 2009.

McLuhan is quick to note that media is defined according to context. His essay begins with a citation from “The Rise of the Waltz” by Curt Sachs, which he uses to explain the social construction behind hot and cold media. He argues that the waltz was considered hot during the eighteenth century, and that this fact might be overlooked by people who lived in the century of Jazz (McLuhan’s own time period). Even though McLuhan does not follow up on this observation, his implicit statement is that how hot and cold are perceived in the twentieth century differs from the eighteenth. Because of this implication, his essay is best read historically. This interpretation makes the reader aware that considering a particular medium hot or cold is a social act, informed by the politics of culture. McLuhan’s first example demonstrates that, while media may become hot or cold according to context, or be hot at one time and cold at another, the terms themselves are not questioned, but rather taken as monolithic points of reference. To make sense of this point, McLuhan’s concepts must be defined.

(more…)

Networked: a (networked_book) about (networked_art) is LIVE!

Note: Here is the official launch of a collaborative project that I have been part of for about two years and that finally sees the light of day.  The official release follows:

PLEASE HELP US SPREAD THE WORD

WE INVITE YOU TO PARTICIPATE: comment, revise, translate, submit a chapter
http://networkedbook.org

Two years in the making, Networked: a (networked_book) about (networked_art) is now open for comments, revisions, and translations. You may also submit a chapter for consideration.

Please register and then Read | Write:

THE IMMEDIATED NOW: NETWORK CULTURE AND THE POETICS OF REALITY
Kazys Varnelis
http://varnelis.networkedbook.org

LIFETRACING: THE TRACES OF A NETWORKED LIFE
Anne Helmond
http://helmond.networkedbook.org

STORAGE IN COLLABORATIVE NETWORKED ART
Jason Freeman
http://freeman.networkedbook.org

DATA UNDERMINING: THE WORK OF NETWORKED ART IN AN AGE OF IMPERCEPTIBILITY
Anna Munster
http://munster.networkedbook.org

ART IN THE AGE OF DATAFLOW: NARRATIVE, AUTHORSHIP, AND INDETERMINACY
Patrick Lichty
http://lichty.networkedbook.org

TAGS: active, aesthetics, aggregators, authenticity, authorship, BEN FRY, BEN RUBIN, BURAK ARIKAN, collaborative, communication, data, data mining, digital traces, distributed, DIY, EDUARDO NAVAS, everyday life, flow, GOLAN LEVIN, identity, improvisation, Internet, JANET CARDIFF, JASON FREEMAN, JODI.ORG, JONATHAN HARRIS, latency, lifelogging, lifetracing, MANIK, mapping, MARK HANSEN, MARTIN WATTENBERG, MAX NEUHAUS, Mechanical Turk, mediation, memory, music, narrative, NastyNets, NATHANIEL STERN, net art, network, NICK KNOUF, nonlinear, OLIVER LARIC, participation, performative, persistence, PETER TRAUB, platform, postmodernism, presentational, privacy, prosumer, prosurfer, ranking, realism, reality, real-time, relational, remix, representation, research, RYBN, SCARLET ELECTRIC, SCOTT KILDALL, search engine, self, self-exposure, SHIFTSPACE.ORG, social networks, software, sousveillance, STEVE LAMBERT, storage, surveillance, tactical media, telepresence, THE HUB, THEY RULE, TrackMeNot, transmission, TV, user-generated, visualization, web 2.0, webcam, widget, Wikipedia Art, YES MEN

BACKGROUND

“Networked” proposes that a history or critique of interactive and/or participatory art must itself be interactive and/or participatory; that the technologies used to create a work suggest new forms a “book” might take.

In 2008, Turbulence.org and its project partners (NewMediaFix, Telic Arts Exchange, and Freewaves) issued an international, open call for chapter proposals. We invited contributions that critically and creatively rethink how networked art is categorized, analyzed, and legitimized (and by whom) as norms of authority, trust, authenticity, and legitimacy evolve.

Our international committee consisted of: Steve Dietz (Northern Lights, MN) :: Martha Gabriel (net artist, Brazil) :: Geert Lovink (Institute of Network Cultures, The Netherlands) :: Nick Montfort (Massachusetts Institute of Technology, MA) :: Anne Bray (LA Freewaves, LA) :: Sean Dockray (Telic Arts Exchange, LA) :: Jo-Anne Green (NRPA, MA) :: Eduardo Navas (newmediaFIX) :: Helen Thorington (NRPA, NY)

Built by Matthew Belanger (our hero!), http://networkedbook.org is powered by WordPress, CommentPress and BuddyPress.

Networked was made possible with funds from the National Endowment for the Arts (United States). Thank you.

We are deeply grateful to Eduardo Navas for his commitment to both this project and past collaborations with Turbulence.org.

Jo-Anne Green and Helen Thorington
jo at turbulence dot org
newradio at turbulence dot org

“After the Blogger as Producer” by Eduardo Navas

Image source: The New Mexico Independent

Written for Interactiva Biennale 2009

The following text was written for the Interactiva 09 Biennale, which takes place during May and June of 2009.  Other texts written for the biennale can be found at the Interactiva site.

NOTE: I have written a text in which I discuss Twitter’s role in social activism, something that is not included in this text. Please see “After Iran’s Twitter Revolution: Egypt.”

In March of 2005 I wrote “The Blogger as Producer.”[1]  The essay proposed blogging as a potentially critical platform for the online writer.  It was written with a specific focus on Walter Benjamin’s well-known text “The Author as Producer,” which viewed the critical writer active during the 1920s and 30s as holding a promising, constructive position in culture. [2]

In 2005 blogging was increasing in popularity, and in my view some of the elements Benjamin entertained appeared to resonate in online culture.  During the first half of the twentieth century, Benjamin considered the newspaper an important cultural development that affected literature and writing, because newspaper readers attained a certain agency as consumers of an increasingly popular medium.  In that period, evaluating letters to the editor was important for newspapers in developing a consistent audience.  In 2005, it was the blogosphere that had the media’s attention: people who wrote their opinions on blogs could be evaluated with unprecedented efficiency. [3]
(more…)

REPOST: An Invention That Could Change the Internet For Ever

Image source: Wolframalpha

Text source: The Independent

Originally published Sunday, 3 May 2009

The new system, Wolfram Alpha, showcased at Harvard University in the US last week, takes the first step towards what many consider to be the internet’s Holy Grail – a global store of information that understands and responds to ordinary language in the same way a person does.

Although the system is still new, it has already produced massive interest and excitement among technology pundits and internet watchers.

Computer experts believe the new search engine will be an evolutionary leap in the development of the internet. Nova Spivack, an internet and computer expert, said that Wolfram Alpha could prove just as important as Google. “It is really impressive and significant,” he wrote. “In fact it may be as important for the web (and the world) as Google, but for a different purpose.”

(more…)

REBLOG: Could Your Social Networks Spill Your Secrets?, by Tom Simonite

Image and Text: NewScientist

Originally posted on January 7, 2009

Via Netbehaviour.org

In an article at the end of last year we looked at some of the ways data-mining techniques are being used by marketeers and security services to extract sometimes private information by assembling huge amounts of data from web visits, emails, purchases, and more.

Now researchers at Google caution in a paper (pdf) that by becoming entangled in ever more social networks online, people are building up their own piles of revealing data. And as more websites gain social features, even the things users strive to keep private won’t necessarily stay that way, they suggest. (more…)

Web 2.0 and Beyond

Image source: Flickr


The past, present, and future of the web are presented in the above diagram. The image is peculiar for proposing a linear development towards AI. It appears that we are about to enter Web 3.0, or perhaps we are already there. Mashups, social media sharing, and social networking are presented as transitional elements from Web 2.0 to 3.0. However, it is not clear to me what “lightweight collaboration” might be.

Below are some interesting articles on the future of the web:

Dean Giustini, Web 3.0 and Medicine. British Medical Journal. December 2007

Nova Spivack, The Third-Generation Web is Coming

Resource, World Wide Mind Project

Tristan Zand, Web 3.0 back to the real world / back to our sense, June 2006

Various articles by John Markoff at the NY Times:

Entrepreneurs See a Web Guided by Common Sense, by John Markoff, November 12, 2006

What I Meant to Say Was Semantic Web, Bits Blog, by John Markoff

On Radar Networks’ Twine.com, a service that uses semantic Web technology, sometimes called Web 3.0, Bits Blog, by John Markoff

For Coors Light, a Night Out That Begins on MySpace, by Stuart Elliott

Image and text source: NYTimes

Published: May 28, 2008

BEER has long been marketed as a sociable beverage, from a campaign for Budweiser that carried the theme “When gentlemen agree” to the Löwenbräu jingle that began, “Here’s to good friends.” Now, another beer brand, Coors Light, is extending its presence in the new media with efforts on the social networking Web sites Facebook and MySpace.

To promote a new wide-mouth Coors Light can, two clips of the “perfect pour” have been posted on YouTube. New media like Facebook and MySpace have also been enlisted by Coors Brewing.

On Facebook next week, consumers 21 and older will be able to send their friends invitations to meet for Coors Light.

Read the entire article at NYTimes

The End of Theory: The Data Deluge Makes the Scientific Method Obsolete, by Chris Anderson

Image and text source: Wired Magazine

June 23, 2008

“All models are wrong, but some are useful.”

So proclaimed statistician George Box 30 years ago, and he was right. But what choice did we have? Only models, from cosmological equations to theories of human behavior, seemed to be able to consistently, if imperfectly, explain the world around us. Until now. Today companies like Google, which have grown up in an era of massively abundant data, don’t have to settle for wrong models. Indeed, they don’t have to settle for models at all.

Sixty years ago, digital computers made information readable. Twenty years ago, the Internet made it reachable. Ten years ago, the first search engine crawlers made it a single database. Now Google and like-minded companies are sifting through the most measured age in history, treating this massive corpus as a laboratory of the human condition. They are the children of the Petabyte Age.

The Petabyte Age is different because more is different. Kilobytes were stored on floppy disks. Megabytes were stored on hard disks. Terabytes were stored in disk arrays. Petabytes are stored in the cloud. As we moved along that progression, we went from the folder analogy to the file cabinet analogy to the library analogy to — well, at petabytes we ran out of organizational analogies.

Read the entire article at Wired Magazine

Wikipedia Questions Paths to More Money, by The Associated Press

Image Capture: Remix Theory

Text source: NYTimes

Scroll the list of the 10 most popular Web sites in the U.S., and you’ll encounter the Internet’s richest corporate players — names like Yahoo, Amazon.com, News Corp., Microsoft and Google.

Except for No. 7: Wikipedia. And there lies a delicate situation.

With 2 million articles in English alone, the Internet encyclopedia “anyone can edit” stormed the Web’s top ranks through the work of unpaid volunteers and the assistance of donors. But that gives Wikipedia far less financial clout than its Web peers, and doing almost anything to improve that situation invites scrutiny from the same community that proudly generates the content.

And so, much as its base of editors and bureaucrats endlessly debates touchy articles and other changes to the site, Wikipedia’s community churns with questions over how the nonprofit Wikimedia Foundation, which oversees the project, should get and spend its money.

Read the entire article at NYTimes
