Saturday, January 28, 2012

sequences and coherence

A sequence in MGGs (aka a concatenation) is just an ordered set of productions (aka grammar rules). In mathematics one usually has countably many ordered elements and studies the limit. When studying grammars and languages, however, one usually has a finite number of grammar rules. (The language generated by a grammar is, roughly, its set of finite sequences.)

I like to think that the importance of sequences stems from the fact that they remove the non-determinism of "the next rule to apply". This is a complexity point of view. Recall that there is another source of non-determinism in MGG: the place in the host graph where the production is applied.

There are some novel concepts related to sequences introduced in the MGG book: compatibility, coherence, initial graphs and congruence. Today I touch on coherence, and the next few posts will probably brush over the rest.

Productions inside a sequence are applied in order, so it can happen that one production performs an action that disturbs some production applied later. Coherence guarantees that this does not happen. Notice how close coherence is to applicability (the possibility of applying the sequence to some initial graph, thus deriving a new graph: the output of the sequence). In fact, it turns out that a sequence is applicable if and only if it is compatible and coherent.
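As a toy illustration of the idea (this is not the book's matrix-based characterization; representing each production by made-up `needs`/`deletes`/`adds` edge sets is an assumption for the sketch), a sequence fails to be coherent when, for instance, an earlier production deletes an edge that a later production still needs:

```python
# Toy coherence check for a sequence of productions. Each production is
# a dict of edge sets: 'needs' (edges in its left-hand side), 'deletes'
# and 'adds'. A simplified stand-in for the coherence conditions in the
# MGG book, kept only to convey the intuition.

def is_coherent(sequence):
    deleted = set()   # edges deleted by earlier productions, not re-added
    added = set()     # edges added by earlier productions, not re-deleted
    for p in sequence:
        # A production cannot need an edge that an earlier one deleted...
        if p["needs"] & deleted:
            return False
        # ...nor add an edge an earlier one already added (an edge to be
        # added must be absent at application time).
        if p["adds"] & added:
            return False
        deleted = (deleted - p["adds"]) | p["deletes"]
        added = (added - p["deletes"]) | p["adds"]
    return True

p1 = {"needs": {("a", "b")}, "deletes": {("a", "b")}, "adds": set()}
p2 = {"needs": {("a", "b")}, "deletes": set(), "adds": set()}

ok = is_coherent([p2, p1])   # p2 runs first, nothing disturbs it
bad = is_coherent([p1, p2])  # p1 deletes the edge p2 still needs
```

Here `[p1, p2]` is not coherent precisely because p1's deletion troubles p2, which is the situation coherence rules out.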

Today's quote is from John von Neumann, and I would like to dedicate it to my friend Álvaro Iglesias: In mathematics you don't understand things. You just get used to them.

Wednesday, January 11, 2012

the very basics III

The very basics of graph dynamics: the nihilation matrix.

Today I touch on the nihilation or nihil matrix. Recall that a production transforms one graph (L) into another (R). When a production is applied to some host graph G (which represents the state of the system under study), L is found in G and substituted by R to derive graph H (which is the new state of the system).

Notice that the left-hand side of a production (L) specifies what elements must be present in a host graph in order to apply the production. The nihilation matrix, represented by K, specifies what edges must not be present in order to apply the production. There are two sets of edges that must not appear: those incident to nodes that are deleted by the production, and those that are going to be added by the production.
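The two conditions above can be assembled mechanically. Here is a small sketch (the function name and input format are made up for illustration): given the nodes a production deletes and the edges it adds, K marks every edge that must be absent from the host graph.

```python
# Toy construction of a nihilation matrix K for a production over n nodes.
# K[i][j] = True means the edge (i, j) must NOT be present in the host
# graph for the production to be applicable. The two sources of forbidden
# edges described above: edges incident to deleted nodes (they would
# dangle) and edges the production is about to add (they must not
# already exist in a simple graph).

def nihilation_matrix(n, deleted_nodes, added_edges):
    K = [[False] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i in deleted_nodes or j in deleted_nodes:
                K[i][j] = True
    for (i, j) in added_edges:
        K[i][j] = True
    return K

# Production on 3 nodes: deletes node 2, adds the edge (0, 1).
K = nihilation_matrix(3, deleted_nodes={2}, added_edges={(0, 1)})
```

Every edge touching node 2 is forbidden (it would be left dangling), and so is the edge (0, 1) that the production creates.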

It is important to notice that the nihilation matrix only makes explicit some implicit information. Then, what is it good for? Actually it is extremely useful. First of all, it can be used to characterize compatibility (closedness of the set of graphs with respect to productions of the grammar). Second, together with L and R, it defines the natural framework for studying application conditions and graph constraints. Most importantly, nihilation matrices are the key idea to relate MGGs and complex analysis, swaps, etcetera (an introduction here).

I have said that the nihilation matrix just makes explicit some implicit information. We know that a production p transforms L into R. So in principle we have all the ingredients to know what the forbidden elements are in the image of the host graph G. Interestingly, the image of the nihil matrix K (which I will represent as Q) evolves according to the inverse of the production:

(L, K) \longmapsto (R, Q) = (p(L), p^{-1}(K))
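This evolution can be sketched with boolean matrices. Assuming the usual MGG matrix form of a production, p(X) = r ∨ (ē ∧ X) with e the deletion matrix and r the addition matrix (an assumption here; check the book for the exact notation), the inverse production simply swaps the roles of e and r, and that swapped form is what acts on K:

```python
# Boolean-matrix sketch of a production p = (e, r) acting on the pair
# (L, K). Assumption (following the usual MGG matrix form):
#   p(X) = r OR (NOT e AND X),
# and the inverse production exchanges e and r.

def apply(delete, add, X):
    n = len(X)
    return [[add[i][j] or (not delete[i][j] and X[i][j])
             for j in range(n)] for i in range(n)]

# 2-node example: L has the edge (0, 1); p deletes it and adds (1, 0).
L = [[False, True], [False, False]]
e = [[False, True], [False, False]]   # deletion matrix
r = [[False, False], [True, False]]   # addition matrix
K = [[False, False], [True, False]]   # (1, 0) is forbidden: it will be added

R = apply(e, r, L)   # R = p(L): direct production on the left-hand side
Q = apply(r, e, K)   # Q = p^{-1}(K): e and r swap roles for the nihil part
```

In the example, R contains only the new edge (1, 0), while Q forbids the deleted edge (0, 1), matching the intuition that what the production erases becomes forbidden in the image.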

More on nihilation matrices can be found in Secs. 4.4 and 7.4 of the MGG book. Let's finish today with a nice quote from Carl Friedrich Gauss: It is not knowledge, but the act of learning, not possession but the act of getting there, which grants the greatest enjoyment.