a text on methods
A major part of how I attribute value to a work is how much flexibility, modularity and scalability have been considered in its making. My belief is that all digital [art]work is indeed part of a craft, and it is consequently the maker's responsibility to allude, at least conceptually, to the common and best practices of said craft, being both backwards and forwards looking. The attribution of value in digital practices and products is complex, but I believe that the industry is now mature enough to be viewed along a spectrum.
Methods are ways of doing. Doing well, or doing fast, or doing properly. Methods are transferable, they can be taught and learned, or they can be made up. Most often they are adaptations and combinations. They may lead to a certain comfort, a confidence in the making, an asset which is interesting, but might well also be worth questioning.
My practices as a researcher and communications maker are mostly anchored in digital processes. This means they both depend on and work only thanks to semi-automated software processes. It is my opinion that digital practitioners need to constantly (cyclically) review their practices in order to continue to progress, and to review the comfort compromises that must be made in order to accommodate such an unstable practice.
Key words: comfort, value, efficiency, ownership, individuality
It is an unfortunate consequence of the priorities of user-oriented software that computational processes are hidden away behind interfaces. One could argue that there is no direct need for the workings of a system language to be immediately visible to the person using common desktop software. My belief is that this hiding away creates a disconnection between the cultural origins of the program and its users' actions. Of course, I understand how difficult it is for a program's interface to expose its process to the end user while also giving access to function, but the total opaqueness of the system's workings for the user results in a rift. A space then exists between the offerings of the program and how it proceeds. What happens can't be explained, but works, so it becomes magic.
This disconnect repeats itself broadly across software and personal computers. All these magical rifts come together and often make it impenetrable for a regular user to peer into the raw materials and formats that lie underneath her or his programs and files. My observation is that because of this constant rift, a large portion of regular users view software as a highly engineered, extremely complex field that only science can manipulate. I'm not here to say that that is totally wrong. However, I'm convinced that if less time were spent on making interfaces sleek and more attention were given to revealing the sequences that, for example, a mouse clicking on a button launches, there would be a large gain in understanding how parts of programs work. Later, a user could notice how different programs call the same routines and similar processes, either from similar instigations or possibly from the usage of different functions. I wish programs showed us their sequences; instead, vendors seem to be busy rebranding their products as applications.
This suggests a model for how software could become more tangible for the regular user, which would result in the understanding that digital practitioners, lower-level software users, are only different from the regular computer user in that they, we, have dug a little deeper into the matter. I hope to establish the understanding that doing and making on computers is indeed a culturally crafted and culture-crafting practice, albeit one that depends on complex physics and mathematics, but also one that, thanks to this power, includes, extends and acts to develop cultures.
modularity = flexibility
This disambiguation speaks to one of the introduction's first claims about value. The recontextualisation of digital practices within active cultures is also partially about redefining where value is derived from. In short, this would be to say that value comes through practice; a minor extension then gives: good value comes from good practice, a not-so-distant relative of best practice. I believe it is a strategy to pick and develop methods that aim for best practices within my fields. I aim to make projects that are modular and forwards looking. Maybe if a piece of code is well written enough, if it is flexible enough, then it can serve a purpose in the next project, or at least serve as a basis. Specifically, this can mean choosing a certain type of language processor over another, focusing on how adaptable the situation could later be. The last decade has seen the appearance of preprocessors (meta-programs that add programming functionality) for declarative languages (languages that describe results within a specific vocabulary rather than give step-by-step instructions), a development it is encouraging to be contemporary with, as a practice.
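To make the preprocessor idea concrete, here is a minimal sketch in Python: a toy expander that substitutes named variables into a declarative snippet, the way preprocessors such as Sass or Less add variables to CSS at a much larger scale. The variable syntax and the `expand` function are invented for this example, not taken from any real tool.

```python
import re

def expand(source: str, variables: dict) -> str:
    """Substitute $name placeholders in a declarative snippet.

    A toy stand-in for what CSS preprocessors do: the declarative
    target language gains variables it does not natively offer,
    and the preprocessor flattens them away before output.
    """
    return re.sub(
        r"\$(\w+)",
        lambda match: variables[match.group(1)],
        source,
    )

# One central "theme" dictionary, reused across every rule.
theme = {"accent": "#c34043", "gutter": "1.5em"}
stylesheet = "a { color: $accent; } p { margin-bottom: $gutter; }"
print(expand(stylesheet, theme))
# a { color: #c34043; } p { margin-bottom: 1.5em; }
```

The point of the sketch is the modularity argument from above: the value (the accent colour, the gutter size) is declared once and flows outwards, so changing it in one place changes it everywhere the next project needs it.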
Building projects modularly demands great attention to one's tools. Often, a digital craftsman will also keep and maintain a set of sub-tools, routines or functions that can be passed on from project to project. Little snippets that can be called upon to solve a recurring problem. In this way, I see many common points again between digital and manual crafts. The caveat, though, is that some of the latest appearances in digital tools follow the application-focused attitude described above. In ways, they set their own boundaries and reduce their potential scope because of the way they act and perform, and the realms of culture they wish to live within. These newer tools, be they graphical or code based, are what is called proprietary software. This tooling model is one where the original maker of the tool has had recognition for the value of their work, but decides to position it not as a culturally included body of work, but as an item that can be leveraged in a service model for revenue. This ownership model is one where I am invited to use but not modify, which in no way accommodates my quest for deep modularity; therefore I choose the former methods, ones that pride themselves on keeping all layers of the tool open for use and modification (often enough these will even encourage the redistribution of one's modifications in the interest of widening the culture).
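A hypothetical example of the kind of snippet that travels from project to project, sketched in Python: a small routine that turns a title into a safe filename or URL slug. The function name and behaviour are my illustration, not a quotation of any particular toolbox, but its qualities are the ones described above: standard library only, short, open for modification.

```python
import re
import unicodedata

def slugify(text: str) -> str:
    """Turn a title into a filename- and URL-safe slug.

    The kind of small, self-contained routine that migrates from
    project to project: no dependencies beyond the standard
    library, easy to read, easy to adapt.
    """
    # Strip accents, then keep only ASCII letters, digits and hyphens.
    normalized = unicodedata.normalize("NFKD", text)
    ascii_text = normalized.encode("ascii", "ignore").decode("ascii")
    slug = re.sub(r"[^a-z0-9]+", "-", ascii_text.lower())
    return slug.strip("-")

print(slugify("A Text on Méthods, 2016"))
# a-text-on-methods-2016
```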
Modularity then is my first chosen method, on the two levels I described above: both in the creation and maintenance of sub-tools, and in the choice of umbrella processes.
Being conscious of my methods is not always easy. The tools I use all come with attached knowledge that either needs to be utilised or acquired. A learning curve. This is clearly a necessity for the models I wish to participate in, but the learning of a tool's workings can come with some consequences; as the matter handled in the digital environment is so intangible, it's not easy to come up with multiple methods of achieving the same results. I mean by this that often a user will learn to apply a function in one way, and will rarely swerve away from it unless this method stops working. This is twofold: on the one hand, this integration of a process gives quite some power; it gets executed faster and does not require granular thought, thus leading to a degree of efficiency in the work. On the other hand, this doing creates a comfort, a sort of dependency.
It's always worked for me like this, I think it should work like this, why would it work any way else.
Here I see a risk. From comfort in processes comes repetition. And when it comes to creation, repetition can become quite stale.
I believe comfort is a commodity, but should never be a goal. Rare are the practitioners that aim for comfort, but unwillingly, within certain choices, some senses of comfort might develop. This issue is best explained with a computer desktop example, the very popular OS X from Apple. This environment is one that I hold at least partly responsible for the deepening of the rift I described in the beginning. The OS X system works along a sandboxing model. This term hails from programming, and gives an idea of where the borders of creation are for software makers. The desktop system boasts a visual and functional consistency across the board. The sandbox lets a software maker write functions and tools, but demands the usage of a certain set of template interface elements for the program to integrate with the rest of the applications. The aim of this is to create a sense of comfort for the user, making sure that there are some constants she or he can expect, that there is some consistency across the board. A quiet comfort. I find this extremely problematic because the sense of comfort it creates is an illusion. I believe this software steamrolling holds users in a tension: when software starts to look or act unexpectedly, it weakens their quiet comfort, possibly leading to a change of method, a change of tool, and a definite missed opportunity for practice change and growth.
I try my best to be aware of what comforts I afford myself in my practice. This method is one of curation: always rearranging the tools in the box, reviewing how they get used, what they help me do, and what and how they help me learn. I believe in the idea of risk. Not so calculated risk. If comfort leads to staleness, change and risk seem to lead to development.
The methods I employ are indirect rejections of principles such as ownership, comfort, privacy, efficiency and usability. Like the hiding away of functions behind interfaces, like the consistency created in desktop environments, like the created senses of quiet comfort, I find a lot of these principles to be selfish and distrustful. In most cases I can trace these issues back to somebody, somewhere, having a way of leveraging the construct for financial growth. It saddens me that these strategies are built as long-term plans in which the user is singularly a customer who will be slowly and repetitively bled and utilised for money and metadata.
Meanwhile, the alternatives I am drawing upon function with one main but extremely significant difference: all of their doings are public. This attitude crosses many dimensions, such as distribution, development cycles, licensing and attribution. It is one simple switch, one simple method, that always requires one to keep oneself in check. Constant questions: would I do this in public? Am I doing something unethical? Am I doing something that might offend somebody? Am I considering all of the corners of this action? This simple switch is tried and approved by the open source software community as a reinstatement of trust and a template for respect, answering almost all of the issues listed at the start of this section.
I force myself to employ this method. I believe that this constant check not only helps me keep the review of my practice open, it also makes me always consider the context of my work. As I am writing this very text, I am publishing unfinished versions of it, as it builds, with mistakes and reviews and writer's notes and uncertainties. The attitude of making things public is one of my key methods to ensure I progress the context of my work in parallel with its content. https://github.com/colmoneill/drafts/blob/master/2016_06_28-text-on-methods.md
I allowed myself three large detours to explain the reasons for the methods I describe. I did not develop these methods; I adapted them from other practices to fit my own. I believe that this understanding of the digital as a craft, its setup as a modular environment constantly published outwards, is key to my practice, which also means that I must step back and look at this sequence as well.