The project seeks to find out what the targets of new data-surveillance algorithms would look and sound like. It is based on research into software designed to identify suspicious behaviors by analyzing communication patterns. Sinister populates IRC chat servers with friendly chat bots that behave innocently but occasionally veer off into the conspirative. Users can interact with the bots through a “Friendster”-type environment and by phone, using access numbers in various countries. Diagrams of the chat bots’ conversation patterns are drawn automatically, and users can annotate these pattern diagrams with interpretations. The diagramming robot can also be sent out to random chat rooms to check for new sinister social networks.
BACKGROUND
Software packages like Microsoft Office and web services like Google are used by countless people each day to make calculations, write business letters, and search the web. They are the common default software tools for everyday computer tasks. Yet every application, from the Windows desktop to the Google search algorithm, has underlying objectives and embodies views that are not purely technological in nature but are reflected in the algorithms and interaction patterns that characterize the software.
Looking at software as more than a neutral tool has been the subject of many publications and discussions in the area of software art and culture: In “It looks like you’re writing a letter” (2001), the software critic and artist Matthew Fuller dissects the functionality of Microsoft Word. At the end of the article he states: “As we have seen, software is reduced too often into being simply a tool for the achievement of pre-existing neutrally-formulated tasks. Culture becomes an engineering problem” (Fuller). Alongside cultural concerns directed at software, software is also a medium for art production. In “Concepts, Notations, Software, Art”, Florian Cramer offers a definition of software art: “To discuss ‘software art’ simply means to not take software for granted, but pay attention to how and by whom programs were written” (Cramer).
In the following, I will analyze a software research effort called “Discovering Hidden Groups in Communication Networks”, currently underway at Rensselaer Polytechnic Institute in Troy, New York. This example illustrates the subtle bias incorporated in software as it reaches beyond desktop computing and into computer science research. The second part of this paper is a summary of my project “Sinister Social Network”, a humorous software-art answer to the algorithmic constructs proposed by the computer science researchers Jeffrey Baumes, Mark Goldberg, Malik Magdon-Ismail and William Wallace, who are spearheading the “Hidden Groups” research effort.
Baumes, Goldberg, Magdon-Ismail and Wallace are working on software which they say can detect terrorist communication patterns in Internet forums and chats without knowing the content of the conversation. This approach differs from previous surveillance systems such as the internationally sponsored Echelon project, which reportedly relies mostly on filters that sift through large amounts of text to identify suspicious keywords, phrases, email addresses, etc. The software envisioned by Baumes, Goldberg, Magdon-Ismail and Wallace follows rules, or algorithms, predetermined by the researchers to detect “malicious” communication patterns. The analysis here is not concerned with what is said but with the behavior of the discussion participants. The following are excerpts from their paper entitled “Discovering Hidden Groups in Communication Networks”:
Hypothesis 1:
"The type of hidden group. We differentiate between trusting and non-trusting (or paranoid) groups. Trusting groups allow messages among group members to be delivered by non-group members, whereas non-trusting groups do not. Trusting groups tend to be benign, while nontrusting groups are more likely to be malicious. The surprising result is that it is easier to detect non-trusting groups; such groups are undermined by their own paranoia." (Baumes, Goldberg, Magdon-Ismail, Wallace 2)
Hypothesis 2:
"Normal communications in the network are voluntary and "random" however a hidden group communicates because it has to communicate (for planning or coordination)." (Baumes et al 1)
Hypothesis 3:
"Group identification done by an algorithm, such as the one proposed in this paper, could be used as the initial step in narrowing down the vast communication network to a smaller set of groups of actors. The communications among these potential hidden groups could then be scrutinized more closely." (Baumes et al, 10)
At the end of the paper they write:
“The properties unique to the hidden group may also be modified for better results. A hidden group may not communicate at every time step, but may be connected more often than legitimate background groups. Also, a property not used in this analysis is that a hidden group is likely to be sparse (i.e. very nearly to a tree) to avoid detection. If there is a often-connected group that is sparse, it could be noted as more suspicious.” (Baumes et al 11)
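The persistence heuristic behind Hypothesis 2 ("a hidden group communicates because it has to communicate") can be sketched in a few lines. This is an illustrative reconstruction, not the researchers' actual algorithm; all names and data are invented: actors whose communication links recur at every observed time step are flagged as a candidate hidden group.

```python
# Illustrative sketch of the persistence idea in Hypothesis 2 (not the
# researchers' actual algorithm): links that are present at EVERY time
# step mark a candidate "hidden group"; casual chatter comes and goes.

def persistent_groups(snapshots):
    """snapshots: list of edge sets, one per time step.
    Returns the connected components of edges present in every snapshot."""
    persistent = set.intersection(*snapshots)  # edges that never disappear
    adj = {}
    for a, b in persistent:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    seen, groups = set(), []
    for node in adj:
        if node in seen:
            continue
        stack, component = [node], set()
        while stack:
            n = stack.pop()
            if n in component:
                continue
            component.add(n)
            stack.extend(adj[n] - component)
        seen |= component
        groups.append(component)
    return groups

# Random chatter changes every step; the "planners" talk at every step.
t1 = {("ann", "bob"), ("eve", "mal"), ("cat", "dan")}
t2 = {("bob", "cat"), ("eve", "mal"), ("ann", "dan")}
t3 = {("ann", "cat"), ("eve", "mal")}
print(persistent_groups([t1, t2, t3]))  # → [{'eve', 'mal'}]
```

The sketch makes the critique below concrete: anyone who happens to talk regularly, for perfectly benign reasons, fits the profile.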
In “Discovering Hidden Groups in Communication Networks”, Baumes, Goldberg, Magdon-Ismail and Wallace model the properties of “sinister” communication in opposition to “normal” communication patterns while failing to provide examples proving that hidden groups actually exist. The profile of the hidden group is therefore based on stereotypical notions of what constitutes terrorist communication.
In the first chapter of The Simulation of Surveillance, William Bogard calls this kind of general, predetermined profile an “observation before the fact” (Bogard 27). His text discusses surveillance and simulation technologies aimed at control: according to Bogard, we may believe that we have choices when the choice has already been made for us. Bogard does not expressly mention desktop software as we know it today, but it is not too far-fetched to draw parallels between the observations and assumptions forming the basis of common computer applications and the profiles used by law enforcement agencies such as the Florida Highway Patrol described by Bogard:
The programmers of desktop software anticipate a certain type of user interaction and model their software’s interface according to these assumptions. On the lookout for possible drug carriers, the Florida Highway Patrol anticipates a certain type of offender by using profiles based on race, sex, type of car, direction of movement and number of passengers. Bogard says that “profiles function as preliminaries to surveillance, means of training and selection, allowing police to scan the passing traffic more efficiently and quickly” (Bogard 27).
Interaction patterns incorporated in computer software shape users of that software by subtly coercing them to interact in a predetermined way with the computer. The profiles function in a similar way: since potential suspects are drawn only from a group that is already narrowed down according to a set of demographic characteristics, all actual offenders will consequently display these attributes.
In Bogard’s words: “the image of the typical offender initiates a series of action designed to eliminate risk – in effect a statistical artefact – rather than respond to an actual offense. […] The image is not false: it is more like a self-fulfilling prophecy – it creates the offense” (Bogard 27).
The idea that the same could be true for diagnostic software like the one in development at Rensselaer Polytechnic Institute is not widely recognized, even though the complicated algorithms that form the basis of these applications have been created according to similar principles.
At the same time, the technology outlined in “Discovering Hidden Groups in Communication Networks” says a lot about the wishful thinking that the general public and researchers alike attach to technology. In the case of the Rensselaer research effort’s paper, it is a diffuse notion of terrorism, apparently lurking in the networks, that needs to be countered by specially designed algorithms.
Sinister Social Network
The Sinister Social Networking Application aims to ironically question the views that lie at the foundation of surveillance software that is designed to algorithmically hunt down terrorist networks. Sinister populates IRC chat servers with humorously camouflaged conspirative bot groups that are accessible through the web as well as by phone.
On its visual surface, Sinister (www.sinister-network.com) is designed as a web service that imitates the popular Internet community “Friendster”. The friendly social-networking metaphor and Friendster’s interaction pattern are just as one-sided as the research effort mentioned above: they portray social networking as something good and friendly, even though the very same behavior might just as well be interpreted as conspirative.
But Sinister is more than a mere exchange of personal data: Sinister populates chats with real villains. The chats take place on public IRC servers, which also play host to “normal” chat groups; the “malicious” chat groups are consequently well camouflaged. Automated scoundrels discuss commonplace topics such as “gardening”, “real estate” and “finance and investments”. They are programmed to occasionally veer from the harmless and generic into the criminal while always staying on topic: the discussion in the gardening channel revolves around fertilizer and its many uses, the real estate channel is a place where plans regarding towers get forged, and the finance and investment channel is all about Enron-style financial scheming. The chat conspiracies include not only terrorism but also white-collar crime and general conspirative activity.
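The bots' behavior described above can be sketched as a simple weighted choice. This is a minimal illustration, not Sinister's actual bot code; all chat lines and the swerve probability are invented for the example:

```python
# Minimal sketch of a Sinister-style bot's line selection (illustrative
# only, not the project's actual code): mostly benign, on-topic small
# talk, with an occasional swerve into the conspirative register of the
# same topic. All lines and probabilities here are invented.

import random

BENIGN = [
    "my roses are thriving this year",
    "does anyone compost kitchen scraps?",
]
SINISTER = [
    "how much fertilizer fits in a rental van?",
    "the delivery has to look like an ordinary garden order",
]

def next_line(rng, swerve_probability=0.1):
    """Pick the bot's next utterance; rarely veer into the sinister pool."""
    pool = SINISTER if rng.random() < swerve_probability else BENIGN
    return rng.choice(pool)

rng = random.Random(7)  # seeded so a demo run is reproducible
for _ in range(5):
    print(next_line(rng))
```

Keeping the sinister lines on topic is what camouflages the bots among the server's legitimate channels.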
While the chat sessions progress, the ongoing conversation is analyzed and visualized in the form of “Social Network Diagrams” by the open-source software PieSpy (http://www.jibble.org/piespy/). This software draws line graphs based on who responds to whom in a selected IRC channel. The resulting communication graphs invite widely varying interpretations: one user might read a shape with five corners as a pentagon and go on to suspect conspirative discussion about defense contracts, and so forth.
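The core of such who-responds-to-whom graphing can be sketched in a few lines. This is an illustrative approximation of one heuristic (a message that opens with another user's nick counts as a reply), not PieSpy's actual implementation, and the log data is invented:

```python
# Illustrative sketch of one who-responds-to-whom heuristic (an
# approximation, not PieSpy's actual code): a message that starts with
# another user's nick is counted as a reply, strengthening that edge.

from collections import Counter

def infer_edges(log, nicks):
    """log: list of (speaker, message) pairs. Returns edge weights."""
    weights = Counter()
    for speaker, message in log:
        words = message.split()
        first_word = words[0].rstrip(":,") if words else ""
        if first_word in nicks and first_word != speaker:
            edge = tuple(sorted((speaker, first_word)))  # undirected edge
            weights[edge] += 1
    return weights

log = [
    ("rose", "anyone tried bone meal on tulips?"),
    ("fern", "rose: yes, works well in autumn"),
    ("rose", "fern: thanks, ordering 50 kilos"),
]
print(infer_edges(log, {"rose", "fern"}))  # Counter({('fern', 'rose'): 2})
```

The edge weights would then drive the graph layout; what the resulting shape "means" is left entirely to the viewer, which is exactly the gap Sinister's annotation feature exploits.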
Users are encouraged to add their own suspicions and ideas of what malicious scheming the persons involved in this group might be up to just based on the picture, not on the content of the conversation. These observations are subsequently archived with time, date and name of the chat group.
The archiving of specially shaped communication graphs and comments regarding their sinisterness was inspired by a slide from a talk given by Goldberg, Magdon-Ismail and Wallace at a conference in 2003 (see image on the left below). It is part of a series of images visualizing the scientific hypotheses, and its layout is simplistic: one column is labeled “No Hidden Group”, the other “Hidden Group”. One of the pictures displays a normal communication graph and the other a sinister one. Besides the red line connecting certain dots, there is nothing to indicate what exactly constitutes a “hidden group” and what constitutes “no hidden group” within the graphs. The hypotheses outlined in the paper, as well as the presentation method, give reason to believe that the hidden group is an algorithmic construct built by the researchers from subjective notions. It is striking that the paper is written from a scientifically objective standpoint even though it obviously contains a high degree of subjectivity. The interpretation feature of Sinister aims at demystifying science by asking users for their own humorously subjective input. The idea is to create a collection of sinister scenarios attached to communication patterns.
Sinister by Phone
“Sinister by Phone” is another component of the multi-networked Sinister service suite: As a result of digitization in the telephone sector, local telephone numbers on several continents are now offered free of charge by companies hoping to win customers. Calls made to these numbers can be redirected to a single server. Sinister uses these local phone numbers as well as VOIP (Voice-over-IP) numbers to extend the sinister interaction scheme into the telephone networks and from there into “real” space, according to the motto “whatever you’re plotting – don’t do it without telling your friends all about it!”
The users of this special telephone service can listen to the discussion going on in the chat room and participate in the conversation via speech-to-text software. The software is programmed to assume the criminal context and therefore interprets all input through its somewhat limited vocabulary. It works in a way similar to the predictive profiling done by the Florida Highway Patrol: while the profiling always returns a criminal who fits the profile, the software always turns the input into the predetermined criminal context.
In the words of Sinister’s creators, the speech-to-text software is “specially calibrated to make detection more accurate”. This corresponds to the efficiency argument that law enforcement officials use to justify profiling.
On the Sinister website, users of Sinister by Phone are encouraged to speak loudly and clearly in public spaces such as airports and train stations, repeating statements if necessary.
This stretches the Sinister interaction scheme into public space, where fear and suspicion reign even more than in the networks and a simple misinterpreted telephone call can trigger a massive manhunt. The online magazine Telepolis reported last August, shortly after the July 7 London subway bombings, that an overheard phone conversation in Hamburg led a bystander to conclude that the caller was planning a terrorist attack with others. This prompted 1,000 police officers to set up checkpoints in the city of Hamburg and to release surveillance camera images of the suspects to the public. The people who were eventually arrested turned out not to be terrorists.
Mediating Sinister to the Public
Even before Sinister reached the beta stage, some of the Sinister chat groups were visited by the administrators of the servers on which they convene. While some of the administrators seemed to enjoy the project, others were not amused (see image below). Overall, however, it seems clear even to the public on the IRC channels that the content of the chat sessions is conspirative.
The converted “Friendster”-style social networking context, although simple, is a way to capture the interest of people who use the Internet and Friendster-style services but may not have a specific interest in IRC, surveillance on the Internet, or dry research publications.
Although the project is now fully functional, it is not finished yet: it needs to reach the public. The project lives on the net, is available 24/7, and can be accessed by the audience whenever they choose. To get people to interact with Sinister, I will post invitations to join on digital culture mailing lists, send press releases to publications like Wired, enter it into festivals, and submit it to the software repository runme.org.
Previous projects of mine have greatly benefited from this method of “going public”. Mailing list postings, for example, have been picked up by political blogs or found their way into journalists’ mailboxes, from where they were converted into articles that introduce the project to a public that might not be as used to digital art as the readers of the digital culture mailing list where the posting originated. From previous experience I can say that the multiplier effects of mailing list postings, newspaper articles and blog postings are not to be underestimated; these distribution channels are essential in creating an audience for a net-based project. Additionally, being included in archives like the software repository runme.org has provided some of my previous projects with a context and presence beyond the rather short-lived but intense attention that mailing list postings, newspaper articles, blog postings and festivals generate.
The outcome of the “going public” described above might prompt me to make changes in the Sinister software. Therefore, the project will require a considerable amount of involvement in the next few months.
Works cited
Fuller, M. “It looks like you're writing a letter”. Telepolis, 7 March 2001. Web site: www.heise.de/tp/r4/artikel/7/7073/1.html
Fuller, M. “Behind the Blip: Software as Culture”. Nettime mailing list, 7 Jan. 2002. Web site: http://amsterdam.nettime.org/Lists-Archives/nettime-l-0201/msg00025.html
Cramer, F. “Concepts, Notations, Software, Art”. 23 March 2002. Web site: http://userpage.fu-berlin.de/~cantsin/homepage/writings/software_art/concept_notations/concepts_notations_software_art.html
Baumes, J., Goldberg, M., Magdon-Ismail, M., & Wallace, W. “Discovering Hidden Groups in Communication Networks”. Rensselaer Polytechnic Institute, Troy, NY. 2nd NSF/NIJ Symposium on Intelligence and Security Informatics (ISI 04), Tucson, AZ, 11–12 June 2004. PDF: www.cs.rpi.edu/~goldberg/publications/hidden-graph.pdf
Magdon-Ismail, M., Wallace, W., & Siebecker, D. “Locating Hidden Groups in Communication Networks Using Hidden Markov Models”. 2003. Slideshow. Postscript: www.cs.rpi.edu/~magdon/talks/ISI03_talk.ps
Bogard, W. “Surveillance, its simulation, and hypercontrol in virtual systems”. The Simulation of Surveillance. Cambridge: Cambridge University Press, 1996.
Gärtner, B. “Checkpoints in Hamburg”. Telepolis, 28 Aug. 2005. Web site: www.heise.de/tp/r4/html/result.xhtml?url=/tp/r4/artikel/20/20815/1.html