Keynote at 4developers: The Game Of Life – Java's Siblings and Heirs are populating the Ecosystem

Posted by Michael Hunger on Mar 29, 2010 in code, development, java, programming languages


I was invited to give a keynote talk at the 4developers conference in Poznan, Poland.

Topic

I'd like to talk about the Java.next programming languages on the JVM and polyglot programming. When pondering how to address this topic, two things came to mind.

The first was that one could see the JVM and its languages as an evolving ecosystem with changing external forces and take a natural view in discussing the whole issue.

But how to compare the languages?

I was reminded of the impressive Game of Life in APL YouTube video that I first heard of from Uncle Bob Martin.

Some of my own explorations of Scala and Clojure were also connected to the Game of Life problem. While watching Ola Bini talk about Ioke at QCon London 2008, I browsed through the Ioke repository and found a Game of Life example, which didn't have the properties Ola proposed for Ioke. So I spent some of the walks to the conference thinking about how to create a more functional solution. I implemented those ideas in Scala and Clojure (published on GitHub).

The third inspiration came from Allen Holub's book "Holub on Patterns", in which he provides a Game of Life Java solution filled to the brim with design patterns. That was not what I consider a maintainable solution.

Getting to work

So I took my implementations, added some references to ecosystems, monoculture, biodiversity and evolution, and started to create the keynote. As I am neither a Scala nor an Ioke expert, I asked for help.

I'd like to thank my friend Sam Aaron for providing the idiomatic Ioke Game of Life, which he also blogged about, and
Heiko Seeberger for the Scala version. I added a simple Java version for comparison, wrote a concise Groovy one and reused my Clojure implementation.

The Game of Life

I let John Conway explain the Game of Life himself, which made for a very well received introduction.

Game of Life as Image Processing

Two days before the keynote, while cycling to work, I was thinking about the talk when it struck me that common image operations like kernel convolution and threshold lookups could easily be used to implement the Game of Life on images. So I took an hour and wrote a small Java program doing exactly that. It is impressive how concise it can be (whole program at GitHub) – just add up the neighbours and give the original cell a special weight so that you can recognize it in the sum.

        private static final int COLOR = 0x484848;

        // each live neighbour contributes its full channel value (0x48) to the sum,
        // the centre cell only half of it, so the convolved value encodes both the
        // neighbour count and whether the cell itself was alive
        private final ConvolveOp convolve = new ConvolveOp(new Kernel(3, 3, new float[]{
                1, 1, 1,
                1, 0.5f, 1,
                1, 1, 1}));

        private final LookupOp lookup = createLookupOp();

        private LookupOp createLookupOp() {
            final short[] lookupTable = new short[256];
            lookupTable[180] = 0x48; // live cell with 2 live neighbours stays alive
            lookupTable[216] = 0x48; // dead cell with 3 live neighbours is born
            lookupTable[252] = 0x48; // live cell with 3 live neighbours stays alive
            return new LookupOp(new ShortLookupTable(0, lookupTable), null);
        }

        private void nextGeneration() {
            image = convolve.filter(image, null);
            lookup.filter(image, image);
            generation++;
        }
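For context, here is a minimal, hypothetical driver around that snippet. The class name, the surrounding fields and the glider seeding are assumptions for this sketch; the complete program on GitHub is the real thing.

import java.awt.image.BufferedImage;
import java.awt.image.ConvolveOp;
import java.awt.image.Kernel;
import java.awt.image.LookupOp;
import java.awt.image.ShortLookupTable;

// Hypothetical driver: seeds a glider and runs a few generations with the
// convolve + lookup technique shown above.
public class ImageLife {

    private static final int COLOR = 0x484848;

    private BufferedImage image = new BufferedImage(64, 64, BufferedImage.TYPE_INT_RGB);
    private int generation;

    private final ConvolveOp convolve = new ConvolveOp(new Kernel(3, 3, new float[]{
            1, 1, 1,
            1, 0.5f, 1,
            1, 1, 1}));

    private final LookupOp lookup = createLookupOp();

    private LookupOp createLookupOp() {
        final short[] lookupTable = new short[256];
        lookupTable[180] = 0x48; // live, 2 neighbours -> stays alive
        lookupTable[216] = 0x48; // dead, 3 neighbours -> born
        lookupTable[252] = 0x48; // live, 3 neighbours -> stays alive
        return new LookupOp(new ShortLookupTable(0, lookupTable), null);
    }

    private void nextGeneration() {
        image = convolve.filter(image, null);
        lookup.filter(image, image);
        generation++;
    }

    // hypothetical seeding: paint a glider onto the image
    private void seedGlider(int x, int y) {
        int[][] glider = {{1, 0}, {2, 1}, {0, 2}, {1, 2}, {2, 2}};
        for (int[] cell : glider) image.setRGB(x + cell[0], y + cell[1], COLOR);
    }

    private int countAlive() {
        int count = 0;
        for (int y = 0; y < image.getHeight(); y++)
            for (int x = 0; x < image.getWidth(); x++)
                if ((image.getRGB(x, y) & 0xFFFFFF) != 0) count++;
        return count;
    }

    public static void main(String[] args) {
        ImageLife life = new ImageLife();
        life.seedGlider(10, 10);
        for (int i = 0; i < 10; i++) {
            life.nextGeneration();
            System.out.println("generation " + life.generation + ": " + life.countAlive() + " cells alive");
        }
    }
}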

At the conference

Traveling to Poland was a bit of an adventure. I went by train, as this (6 hours) should be quicker than getting to Berlin and flying from there. The first train was a rail car, and I had to upgrade to first class to get a power outlet. In Poland the train stopped for more than half an hour, so I missed my connecting train in Wroclaw, but I figured out the connection and took the second one. I was collected from the main station by the organizers and taken to Boveria for the speakers' dinner, where they served delicious food. The host Anna Kolodziejczyk explained that we wouldn't be accepted by the Polish audience if we didn't taste vodka, so we were served 5 rounds of different types of vodka, which were also delicious. After that we walked through the town (at 1 am) to the hotel.

The next morning at 6 am the hot sun woke me and fortunately my head was clear :).
The keynote went really well, except for Murphy striking at the beginning: Anna disconnected my Mac from the projector (I had already set everything up) and Camino crashed on reconnect; restarting it restarted all the videos and made a mess. I used my Kymera Magic Wand to advance the presentation and later gave it to the conference organizers as a prize for the drawing.

I was very pleased to meet Hadi Hariri, the JetBrains technology evangelist, who welcomed me with the question: “You’re a Java guy, what’s your IDE?”, which I answered correctly. He also asked me why I'm not a member of the JetBrains Academy – something we have now changed.

Meeting Peter Horsten, the owner of Goyello, was also a very satisfying experience – thanks for all the insights.

More Resources

Source Code

At least some snippets of the source code I showed:

Scala by Heiko Seeberger

// assumes a case class Cell(x: Int, y: Int) and Alive/Dead characters in the Generation companion object
class Generation(val alive: Set[Cell] = Set.empty) {
  require(alive != null, "Illegal argument: alive must not be null!")

  def next: Generation = {
    val stayingAlive = alive filter { cell => (2 to 3) contains aliveNeighbours(cell).size }
    val wakingFromDead = alive flatMap deadNeighbours filter { cell => aliveNeighbours(cell).size == 3 }
    new Generation(stayingAlive ++ wakingFromDead)
  }

  private def neighbours(cell: Cell) =
    for {
      dx <- -1 to 1
      dy <- -1 to 1
      n = Cell(cell.x + dx, cell.y + dy)
      if n != cell
    } yield n

  private def aliveNeighbours(cell: Cell) = neighbours(cell) filter { alive contains _ }

  private def deadNeighbours(cell: Cell) = neighbours(cell) filterNot { alive contains _ }

  override def toString: String = {
    val rows = (dimension._3 to dimension._4).reverse map { y =>
      val row = dimension._1 to dimension._2 map { x =>
        if (alive contains Cell(x, y)) Generation.Alive else Generation.Dead
      }
      row.mkString
    }
    rows mkString "\n"
  }

  private lazy val dimension =
    ((alive map { _.x }).min,
     (alive map { _.x }).max,
     (alive map { _.y }).min,
     (alive map { _.y }).max)
}

Ioke by Sam Aaron

GameOfLife = Origin mimic do(
 initialize = method(rows:, columns:,
   @grid = Grid withDimensions(rows, columns))

 evolve = method(
   nextGrid = @grid blankGrid
   @grid seq each(cellInfo,
     if(#{2,3} include?(cellInfo numLiveNeighbours), nextGrid spawnCell(cellInfo coords first, cellInfo coords second)))
   @grid = nextGrid
 )

 spawn = method(row, col,
   @grid spawn(row, col)
   @grid)
)

GameOfLife Grid = Origin mimic do(
 Cell = Origin mimic

 withDimensions = method(rows, columns,
   self with(
     state:     Dict withDefault(0),
     numCols:   columns,
     maxRowIdx: rows - 1,
     maxColIdx: columns - 1
   )
 )

 blankGrid = method(self with(state: Dict withDefault(0)))
 spawnCell = method(row, col, @state[[row, col]] = 1)
 seq       = macro(allCells seq)

 ;internal methods
 permutations = method(a, b, a flatMap(i, b map(j, (i,j))))

 countLiveNeighbours = method("Count the number of live neighbours for a given set of cell coordinates",
   row, col,
   neighbourCoords = permutations((-1..1), (-1..1)) - [(0,0)]
   neighbourCoords inject(0, sum, (r_mod,c_mod), @state[[row + r_mod, col + c_mod]] + sum))

 allCellCoords = method("Generate a list tuples representing all the coordinates of the cells in the Grid",
   permutations(0..@maxRowIdx, 0..@maxColIdx))

 allCells = method("Generate a list of all the cells in the Grid consisting of coordinates and number of live neighbours",
   allCellCoords map((row, col), Cell with(coords: (row, col), numLiveNeighbours: countLiveNeighbours(row, col))))

 ;for output purposes
 asList = method("Returns the list of cells as a list of 0s and 1s, with 1 representing a live cell",
   rowsOfCoords = allCellCoords seq sliced(@numCols)
   cellList = []
   while(rowsOfCoords next?, cellList << (rowsOfCoords next map((row, col), @state[[row,col]]))))
 asText = method("Returns the list of cells in a pretty ascii art representation",
   "\n+-#{"--" * (@numCols)}+\n| #{asList map(map(i, if(i == 0, "  ", "* ")) join) join("|\n| ")}|\n+-#{"--" * (@numCols)}+")
)

Groovy by me

class GameOfLife {
  static int X=0,Y=1;
  static def env = [[-1,-1],[0,-1],[1,-1],[-1,0],[1,0],[-1,1],[0,1],[1,1]]
  def alive = []

  def neighbours(def cell) {
    env.collect { [cell[X]+it[X],cell[Y]+it[Y]] }
  }

  def aliveNeighbours(def cell) {
    neighbours(cell).findAll {alive.contains(it) }
  }

  def deadNeighbours(def cell) {
    neighbours(cell) - aliveNeighbours(cell)
  }

  def haveNeighbourCount(def coll, def counts) {
    coll.findAll { counts.contains(aliveNeighbours(it).size())}
  }

  GameOfLife next() {
    def stayingAlive = haveNeighbourCount(alive, [2,3])
    def wakingFromDead = alive.inject([]) { res,cell ->
                          res += haveNeighbourCount(deadNeighbours(cell),[3])}

    new GameOfLife(alive: new HashSet(stayingAlive + wakingFromDead))
  }
  String toString() {
    (alive.min{it[Y]}[Y]..alive.max{it[Y]}[Y]).collect { y ->
      (alive.min{it[X]}[X]..alive.max{it[X]}[X]).collect { x ->
        alive.contains([x,y]) ? "X" : "."
      }.join("") + "\n"
    }.join("")
  }
  static GameOfLife fromString(def str) {
    int x=0,y=0;
    def alive=[]
    str.each {
      if (it == 'X') alive+=[[x,y]];
      if (it=='\n') { x=0;y--;}
      else x++;
    }
    new GameOfLife(alive: alive)
  }
}
def gol=GameOfLife.fromString(
"""
   X
    X
    X
  XXX
""")
200.times{ println gol = gol.next() }

Transcription

Good morning,

It feels good as always to be with such a big crowd of passionate software developers who come to conferences to learn and share.

Unfortunately I caught a cold from my kids. So my voice will not be in its best shape. Please bear with me.

Some words about me. I'm a software developer like you, passionate about his craft, who enjoys writing good software every day. Besides reading a lot and contributing to open source projects, I started talking and writing about the things I do and am interested in. Instead of writing my own books, I'd rather help people get theirs published, so I have reviewed several over the last few years.

But let's rather dive into the interesting part of this morning. I was inspired to give this talk by Uncle Bob Martin, who talked to Scott Hanselman about design principles and the goal of creating better abstractions to work with. They talked about language, library and API evolution providing better concepts that encapsulate more details and make it easier to grasp and extend existing systems. During that talk Uncle Bob referred to APL, which with its really big set of symbols (you even needed special terminals and keyboards for programming APL) is a very expressive programming language. For demonstration purposes he pointed to the Game of Life demo in Dyalog APL that is available on YouTube. You will be able to enjoy parts of it later.

While watching it I was really impressed by the power of the language, which (with a fitting approach) made building up such a complex statement that easy. It made me think about the Game of Life itself, the power of abstractions within a language, and where we stand right now in that regard.

You certainly know that bringing together such different areas as software development and biology is a tricky thing. Richard Stallman of the GNU project even condemned people who tried to apply the term ecosystem to software systems.

But I think there are some valuable ideas in looking at our working environment as a kind of ecosystem. In the end it is the place where we spend most of our time (besides our beds). So this time I'd like to take a natural view of the Java programming ecosystem.

Some important aspects that come to mind and should be considered are listed here: the ecosystem itself, and at both ends of the spectrum our man-made monocultures and the original biodiversity. But there would have been no development and improvement without the evolutionary principles and the forces of selection that were made popular by Charles Darwin.

When talking about ecosystems you talk about their inhabitants and relationships as well as forces and external influences. We know many different ecosystems, some still natural, many artificial. They are influenced by internal development and external forces. Some are very sensitive to changes (say coral reefs), others quite stable.

Looking at our workplace there are many parties involved: our customers and organizations, our team and us developers. The other side is hardware, infrastructure, tools, languages and libraries. When you imagine a healthy ecosystem, it contains positive, supporting relationships. So I think of a habitable one that makes working enjoyable and fun. But the ecosystem is not stable, it evolves all the time: the human parts are always learning and changing, and the technical side evolves as well (a good question is which side develops faster).

Today I'd like to look at only a small part of this ecosystem, although the other parts are no less interesting. I want to focus on the tools and languages we're using, especially the JVM and its languages.

Let's start with the JVM. That's really a cool piece of technology: an execution environment that abstracts away hardware, operating systems and much more, so we gain portability. Not only that, it is constantly evolving and improving. We've already seen massive performance improvements in the last years, especially around memory allocation and IO, and there is the experimental Da Vinci Machine, the playground for VM engineers experimenting with many different aspects. And the JVM doesn't run just a single programming language – even that is abstracted away. The bytecode that is executed on the JVM, although originally based on the Java language, can be produced by compilers for many languages. And it is. But that's one step too far for now.

Lets first look in the darker age. We’re talking monoculture here.
Monoculture is an artificial thing. To yield higher crops a single species is selected and installed into an ecosystem, and then standardization and economies of scale are used to drive the margins higher and higher. But nothing comes for free. Any ecosystem that is based on only one or a few species is very sensitive to external forces like disease or climatic and other environmental change. There are no safety nets, no supporting relationships that can protect the ecosystem. Another bad aspect is that monoculture ecosystems tend to shut out outside influences, and with them evolution, development and improvement from within the system itself.

And we had our share of monoculture. During the '90s and early 2000s Java was the lone regent of the JVM. In the beginning it was slow as hell, but the technical issues improved. We were then struck by enterprisey standards and architectures. This was all about economy of scale – how many Java developers and companies are there? – about standards and increasing profits, especially for the big vendors. In the mainstream there was little knowledge about other languages on the JVM, let alone interest in learning them. Java itself was extended time and again and the libraries grew enormously. To keep compatibility the extensions were crippled and many interesting things were left out. The .NET platform caught up easily and drove ahead. But fortunately things began to change for the better. But why? And what is better? OK, the latter first.

Natural diversity. In a natural ecosystem there is a rich selection of different species occupying many niches and often supporting each other in manifold ways. Such a system is much more resistant to disruptions than a monoculture; it can repair itself or at least recover quickly. Also, as not all inhabitants are affected in the same way by external forces, the impact is much smaller. In such an ecosystem the responsibilities for different tasks are distributed between its members: some can be responsible for providing energy or protection, others for creating biomass, and so on. As there are so many species involved, the areas for evolution are manifold as well.

How does this translate to the JVM and its languages? As already mentioned, the JVM can run many languages. And it does. But this went unnoticed by most people. There are different lists of JVM languages, but it is assumed that about 400 languages exist that run on the JVM. Most of them are of course minor ones used in education or experimentation. All those languages address different problems and have different interesting properties. Most of them are more expressive than Java itself. A (non-exclusive) list will follow.

What else changed after the dark ages? Alternative frameworks for developing applications emerged, some original like the Spring Framework, others inspired by web framework success stories (Ruby on Rails) like Grails. We got post-EJB OR mappers and many alternative data stores, scalability and messaging solutions; there has been a lot of movement in the library as well as in the language camp in these times. Many of the libraries quickly employed the more advanced Java language features (annotations, generics) as well as library features (java.util.concurrent, NIO).

But why? What happened? Let's look at evolution for some questions and answers. This year marks the 200th birthday of Charles Darwin, who collected lots of evidence during his travels that species evolved from common ancestors and that external forces drove a selection process which enabled the best adapted individuals to survive.

What forces appeared that changed the Java ecosystem?

The cost of hardware dropped more and more while the cost of highly productive and skilled developers rose. So we can move the focus from machine performance optimization to human performance optimization. That means focusing on the fact that code is read much more often than it is written and has to be understood, maintained and kept in good shape and design. So the understandability and expressiveness of the code we write must increase dramatically.

The cheap hardware comes with a price. We don't just get faster machines but more, cheaper machines with more cores; computing power and storage are simply aggregated. That creates the concurrency challenge. Most people say that writing correct concurrent code is hard (only some disagree), and using the low-level means of concurrency control makes the code much more complex and less understandable. So there is a new quest for better abstractions and means for parallelization and scalability.

With the increasing power of languages, tools and hardware the complexity of the systems we create grows dramatically. This has to be countered somehow. Many approaches favor simpler programming models that take away lots of repetitive boilerplate code and let us focus on solving the problem domain.

And then there were competing languages and frameworks that had no equivalent on the JVM. Of course the successful ones were copied or ported to the JVM to get their benefits on the familiar execution environment.

So here we are: right now many of the languages that already existed or newly came into existence are getting the attention of the mainstream. They are adopted for lots of different projects and have gained quite a lot of momentum. Besides the languages, frameworks and infrastructure of course also evolved, but they are not my focus today.

(Here I said something about each of the languages.)

I want to address three of the forces that are acting on the JVM ecosystem.

First, the concurrency challenge. As already said, today's hardware is not necessarily faster but cheaper and plentiful (and multicore), and writing correct and robust concurrent code is hard (as everyone who has read Brian Goetz' Java Concurrency in Practice can confirm). Paradigms that focus on immutable data and idempotent operations are much easier to parallelize; side effects and shared state make it much more difficult. That's why functional approaches have such a big appeal for concurrency. Shared state is for instance addressed in Clojure with an implementation of Software Transactional Memory (STM). Languages like Erlang have a lightweight actor model baked in; Scala, the Akka framework and hopefully soon Erjang address this on the JVM.

Expressiveness is imho a very important force. We have to find better abstractions that can be used to build the complex systems we're faced with today. Getting rid of unnecessary code, using conventions, and having higher-level syntactic constructs (higher-order functions) for working with collections and closures as a lightweight callback mechanism helps us write more expressive code. For instance, Groovy developers often report that they can take many Java classes and cut them down to 25% of their size. That's just what the language and libraries give you.

Other languages enable you to write succinct code using the power of metaprogramming and open types. I think it is one of Java's big limitations that you can't easily extend the existing types within the language.

Having a simpler lifecycle that doesn't necessarily include a compilation step also increases the effectiveness of developers; just reloading makes it fun. On the JVM the only solution I know of that provides this right now is the fantastic JRebel Java agent, which gives you instant hotswapping after (incremental) recompilation.

But there is another aspect: when dealing with the problem domain, no one talks about technical constructs, it is all about the language of the business. Yet if we look at the code that results from our insights, it is hard to spot the original concepts. Creating little languages and APIs that express the concerns of the business in a readable manner (i.e. DSLs) is a solution that raises awareness of what we're trying to achieve. But creating those (internal DSLs) within Java is not much fun, and the result is readable only to programmers. Many of the other (especially dynamic) languages and Scala were designed for, and are well suited to, creating internal DSLs.

Some more thoughts on the competition force. I think this is one of the major forces that let the alternative languages become commonplace.

Especially in the realm of web frameworks the convention over configuration and opinionated approach of Ruby on Rails had a big impact.

Other relevant areas were the proven scalability and reliability of Erlang, the conciseness and robustness of functional approaches and the impressive development in the .NET space. Even Flash/Flex as a competitor for the rich client was addressed by Sun by developing and promoting the JavaFX scripting language.

Many tried to copy the successes and so adequate solutions were created for the JVM.

Many people are talking about polyglot programming and the new Tower of Babel. I'd like to quote two well-known figures (funnily enough, both are ThoughtWorkers) here.

First Ola Bini, a language geek who is a contributor to the JRuby implementation and has recently created his own language experiment called Ioke. He talks about the "end of big languages", meaning that they will lose their importance and will be more and more replaced (in certain areas) by specialized languages. From his experience with JRuby he is certain that the JVM is a good polyglot environment.

He said that the current development will lead to applications that are composed of several language layers, with a stable language at the core surrounded by more dynamic languages. The already mentioned domain issues will be addressed in a third, domain layer that contains DSLs, specific domain-oriented APIs and the core domain model.

Interestingly, but understandably, he doesn't see Java as the future stable core language but something else – perhaps Scala.

Neal Ford speaks especially about polyglot programming and points to the fact that in most projects we already use a lot of languages to achieve our goals. This trend will continue, so that specialized languages are employed to fulfill certain objectives. He is also a strong proponent of DSLs as a means of communicating with the domain user and having them discuss and validate what we are writing in our code.

Now let's get a life. All the talk about ecosystems, diversity and the biological viewpoint was just a stepping stone to this part. This is the meat of the talk: you'll see real code, lots of it, see stuff running, and have me discuss some of the interesting bits.

I already talked about my first inspiration for this talk, but there were two more.

There was another experience that led me here. When I attended QCon two years ago I watched Ola Bini presenting Ioke, an experimental, expressive programming language based on aspects of Lisp, Io, Self and other sources. While watching the talk I browsed the repository and came across this implementation.

I didn't find it to be very expressive or succinct. So I started thinking about other, functional approaches to the Game of Life, which led me into writing Scala and Clojure versions to experiment with my ideas.

And a third one.

While having a discussion about the evilness of getters and setters and keeping the state of objects internal on the Qi4j mailing list, I encountered Allen Holub's importer and exporter patterns for the first time. These made me interested and I got his book. Funnily, about one third of it is devoted to discussing a Game of Life example to which he added as many patterns as he could think of – the poor cellular automaton. This clear overuse of patterns, and the use of Java as the language, make it a perfect example of how badly a well-meant intention can go wrong. I think it is clearly overdesigned, and any customer who just wanted a Game of Life would not be satisfied with the effort it took to write and will take to maintain and extend.

But as with life, the universe and the rest, let's start at the beginning.

What is the Game Of Life?

It is a simple cellular automaton with a small set of rules that governs the life of cells from one generation to the next.
It was invented 40 years ago (already) by the mathematician John Conway and has been the subject of many different research and programming projects ever since.

But I think he tells the story best himself.

Video shown.

What's the point of using the Game of Life as a means to explore languages? It is a simple example but not a simplistic one. You can see the basic program structure as well as list operations, filtering and decision making, and depending on the completeness also things like visualization, string parsing and rendering. The succinctness of a language becomes evident quickly. If you choose a more scalable approach, other properties of languages and libraries can be shown (especially parallelism, sharding, optimized data structures).

99 Bottles of Beer is another example of this; there is even a website that has collected examples in hundreds of programming languages for rendering the lyrics of the famous song.

There are many different approaches to solving the problem. They all have to count the neighbours of a cell and manage generations, but that's about all the commonality. The two-dimensional array approach is the fastest but doesn't handle an infinite board. A coordinate-based approach using a list (or rather a set) of live cells doesn't have this problem and may also consume less memory.
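To make that concrete, here is a rough sketch of the coordinate-set approach in plain Java – a simple illustration of the idea, not the exact code from the talk:

import java.awt.Point;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// A generation is a set of live cells; the next generation is derived purely
// from neighbour counts, so there is no fixed board size.
public class SetLife {

    static Set<Point> nextGeneration(Set<Point> alive) {
        // count live neighbours for every cell adjacent to a live cell
        Map<Point, Integer> neighbourCounts = new HashMap<Point, Integer>();
        for (Point cell : alive) {
            for (int dx = -1; dx <= 1; dx++) {
                for (int dy = -1; dy <= 1; dy++) {
                    if (dx == 0 && dy == 0) continue;
                    Point neighbour = new Point(cell.x + dx, cell.y + dy);
                    Integer count = neighbourCounts.get(neighbour);
                    neighbourCounts.put(neighbour, count == null ? 1 : count + 1);
                }
            }
        }
        // apply the rules: survive with 2 or 3 neighbours, be born with exactly 3
        Set<Point> next = new HashSet<Point>();
        for (Map.Entry<Point, Integer> entry : neighbourCounts.entrySet()) {
            int count = entry.getValue();
            if (count == 3 || (count == 2 && alive.contains(entry.getKey())))
                next.add(entry.getKey());
        }
        return next;
    }
}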

Another difference is whether you have a sequence of immutable generations or mutate the data structure itself. Both offer different options for optimization.

There is even a highly optimized version called HashLife that uses hash-based caching of similar subpatterns and calculates them only once.

And of course there are limitless variations for visualization and different rulesets and board layouts.

But let's start with some code.
First a simple Java approach I wrote yesterday that uses just a set and filtering. You can see the repeated loops over the collections and the generics.

The second example I want to show you are the core classes of Allen Holub's version of Life that I already discussed earlier. You can certainly recognize the different design patterns, but the sheer amount of code is impressive.

Then I wrote a small Groovy solution that uses the same approach but with lots of idiomatic list operations; it takes less than half the space the Java variant needs. It contains no for loops and no mutable state. I also added a simple toString representation that just calculates the two corners of the field and then emits an X for each living cell on the board.

I just want to say, for all the languages, that I'm not an expert in every one of them. Certainly there are more idiomatic and concise solutions for this problem.

Next on the list is an idiomatic Scala version by Heiko Seeberger, who is a very active Scala programmer and, for instance, the creator of the Scala Modules OSGi DSL.

It is very similar to the Groovy version. The main difference is that Scala looks that way although it is statically typed. A second difference is the omission of parentheses for single-argument methods and of dots for invocations. You can also see Scala's for comprehensions in action here, which are much more powerful than a Java for loop and also offer lazy evaluation.
Scala has much more to offer. As an object-functional language it combines the power of both paradigms and offers a lot of syntactic sugar and many powerful programming constructs (traits, implicits, tuples, pattern matching, case classes and more).

Clojure is an interesting language. It is based on Lisp, and as many of you certainly know, Lisp already had many of the features of current languages (and even more) in the 1960s. Clojure adds quite a bit to Lisp's power. It has immutable data structures, and mutable ones via STM. Clojure's Java integration is very well implemented; you can actually write the same code with fewer parentheses than in Java itself. Clojure supports higher-order functions as well as constructs like let and list comprehensions. A well laid out API, laziness and other things make Clojure an environment that is fun to work with.

The final example for a JVM language is Ioke the experimental language by Ola Bini. It is a prototype based language that is completely message based and offers very powerful macro facilities. Sam Aaron who provided the example is one of the committers to the Ioke project.

The last example of overwhelming expressiveness is presented in Dyalog APL. Please watch closely and try to keep up with what he's doing and explaining.

We’re almost done here.

There is still the question of how you can scale this up on the JVM. Fortunately the new languages, frameworks and data stores make this easier. A first step is to partition the grid over multiple machines; if most of the calculation happens locally, the coordination effort is not an issue. Using Erlang-style message passing between actors you could also model the cells as actors that are spawned or die. Liveness information as well as a synchronizing clock would then be sent as asynchronous messages. The Akka framework provides an actor implementation in Scala; it can handle 6 million actors on a commodity machine with 4 GB of RAM. Its supervisor mechanisms can partition the actor cell cloud into manageable chunks. As this approach is asynchronous, it has to handle the correct synchronisation of the flow of life within each generation.
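As a rough single-machine sketch of the partitioning idea (plain Java threads standing in for machines or actors, so only an illustration, not Akka): the candidate cells are split into chunks and each chunk's share of the next generation is computed concurrently against the read-only current generation.

import java.awt.Point;
import java.util.*;
import java.util.concurrent.*;

// The immutable current generation is shared read-only; each worker computes
// the fate of its own chunk of candidate cells.
public class PartitionedLife {

    static Set<Point> nextGeneration(final Set<Point> alive, int workers)
            throws InterruptedException, ExecutionException {
        // candidates = live cells plus all of their neighbours
        Set<Point> candidates = new HashSet<Point>();
        for (Point cell : alive)
            for (int dx = -1; dx <= 1; dx++)
                for (int dy = -1; dy <= 1; dy++)
                    candidates.add(new Point(cell.x + dx, cell.y + dy));

        List<Point> candidateList = new ArrayList<Point>(candidates);
        ExecutorService pool = Executors.newFixedThreadPool(workers);
        List<Future<Set<Point>>> partialResults = new ArrayList<Future<Set<Point>>>();
        int chunkSize = (candidateList.size() + workers - 1) / workers;

        for (int start = 0; start < candidateList.size(); start += chunkSize) {
            final List<Point> chunk =
                    candidateList.subList(start, Math.min(start + chunkSize, candidateList.size()));
            partialResults.add(pool.submit(new Callable<Set<Point>>() {
                public Set<Point> call() {
                    Set<Point> result = new HashSet<Point>();
                    for (Point cell : chunk) {
                        int neighbours = 0;
                        for (int dx = -1; dx <= 1; dx++)
                            for (int dy = -1; dy <= 1; dy++)
                                if ((dx != 0 || dy != 0)
                                        && alive.contains(new Point(cell.x + dx, cell.y + dy)))
                                    neighbours++;
                        if (neighbours == 3 || (neighbours == 2 && alive.contains(cell)))
                            result.add(cell);
                    }
                    return result;
                }
            }));
        }

        Set<Point> next = new HashSet<Point>();
        for (Future<Set<Point>> partial : partialResults) next.addAll(partial.get());
        pool.shutdown();
        return next;
    }
}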

The MapReduce approach, which was popularized by Google and is also available with Apache Hadoop and other frameworks, could be used if you see the life rules as a two-step process. In the first, map step each cell bleeds its life in equal parts into its neighbouring cells. In the second, reduce step those life fragments are counted and the liveness of the cell is determined.
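Sketched in plain Java (without any Hadoop API, just to illustrate the two phases): the map step lets every live cell emit a life fragment for each of its neighbours, the reduce step counts the fragments per cell and applies the rules.

import java.awt.Point;
import java.util.*;

public class MapReduceLife {

    static Set<Point> nextGeneration(Set<Point> alive) {
        // map phase: each live cell emits one fragment per neighbouring cell,
        // grouped by the receiving cell (the "key")
        Map<Point, List<Integer>> fragments = new HashMap<Point, List<Integer>>();
        for (Point cell : alive) {
            for (int dx = -1; dx <= 1; dx++) {
                for (int dy = -1; dy <= 1; dy++) {
                    if (dx == 0 && dy == 0) continue;
                    Point neighbour = new Point(cell.x + dx, cell.y + dy);
                    if (!fragments.containsKey(neighbour))
                        fragments.put(neighbour, new ArrayList<Integer>());
                    fragments.get(neighbour).add(1);
                }
            }
        }
        // reduce phase: count the fragments per cell and decide on its liveness
        Set<Point> next = new HashSet<Point>();
        for (Map.Entry<Point, List<Integer>> entry : fragments.entrySet()) {
            int liveNeighbours = entry.getValue().size();
            if (liveNeighbours == 3 || (liveNeighbours == 2 && alive.contains(entry.getKey())))
                next.add(entry.getKey());
        }
        return next;
    }
}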

A very performant Life implementation is HashLife, which takes the regularity of the resulting patterns into account. Each pattern is stored in a quad-tree hash and calculated only once. There are several HashLife implementations that perform "explosively".

When riding my bike to my client two days ago I thought about the keynote and the different approaches, and as I have done a bit of image manipulation, the Game of Life rules reminded me of the well-known kernel matrix convolution operations that are so common in image processing. Counting the neighbours with a kernel matrix is easy, and so is differentiating the center pixel by assigning it a different weight. The resulting values just have to be mapped to valid colors according to the GoL rules using a lookup operation. That's it. Almost no code, it scales well and can use the 2D acceleration available with current GPUs. So I wrote a small demo of this technique.

A language that is better suited for the needed abstractions is the first step. Evolving this thought, we could easily imagine a language for defining cellular automata that encapsulates implementation details and exposes them only as configuration options. This language would use the vocabulary of cells, grids, generations and evolutionary rules – in essence it would describe the ecosystem of the Game of Life.
You can and should consider this when solving the problems you're faced with. It is much easier to represent the intended solution, and to understand and maintain it, when you first create a language or abstraction to define the problem concisely. That's what the Lisp guys have been doing for decades; we're slowly getting there.
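As a purely hypothetical illustration of that direction, even plain Java can move a small step towards such a vocabulary by turning the rules into configuration behind a tiny fluent API; all the names in this sketch are invented.

import java.awt.Point;
import java.util.*;

public class AutomatonDsl {

    // the rule set is configuration, the evolution mechanics are hidden behind it
    static class Automaton {
        private final Set<Integer> survive = new HashSet<Integer>();
        private final Set<Integer> born = new HashSet<Integer>();

        Automaton surviveWith(int... counts) { for (int c : counts) survive.add(c); return this; }
        Automaton bornWith(int... counts)    { for (int c : counts) born.add(c);    return this; }

        Set<Point> evolve(Set<Point> alive, int generations) {
            Set<Point> current = alive;
            for (int i = 0; i < generations; i++) current = next(current);
            return current;
        }

        private Set<Point> next(Set<Point> alive) {
            Map<Point, Integer> counts = new HashMap<Point, Integer>();
            for (Point cell : alive)
                for (int dx = -1; dx <= 1; dx++)
                    for (int dy = -1; dy <= 1; dy++) {
                        if (dx == 0 && dy == 0) continue;
                        Point n = new Point(cell.x + dx, cell.y + dy);
                        counts.put(n, counts.containsKey(n) ? counts.get(n) + 1 : 1);
                    }
            Set<Point> next = new HashSet<Point>();
            for (Map.Entry<Point, Integer> e : counts.entrySet())
                if (alive.contains(e.getKey()) ? survive.contains(e.getValue())
                                               : born.contains(e.getValue()))
                    next.add(e.getKey());
            return next;
        }
    }

    public static void main(String[] args) {
        // Conway's rules expressed as configuration; other rule sets are one call away
        Automaton conway = new Automaton().surviveWith(2, 3).bornWith(3);
        Set<Point> blinker = new HashSet<Point>(Arrays.asList(
                new Point(0, 1), new Point(1, 1), new Point(2, 1)));
        System.out.println(conway.evolve(blinker, 1)); // vertical blinker after one step
    }
}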
