Tuesday, December 11, 2012

Saturday, November 24, 2012

Sunday, November 11, 2012

time is all we have

A blog about making the most of time. Looking back and enjoying what you have done. A brief review of my existence.

Sunday, October 28, 2012

are mathematics invented or discovered?

discovery or invention?

In the LinkedIn group about mathematics "Math, Math Education, Math Culture", Opher Liba asked an old and recurrent question in mathematics: Invention or Discovery? This was the title. Many people contributed to this interesting topic. I am not going to summarize all the relevant opinions, some of which were quite elaborate. Just let me highlight the new term proposed by Jonathan Visona: innovery (a mixture of invention and discovery). I will limit myself to reproducing my entry.

I am one of those who think that mathematics is nothing more (and nothing less) than a language: http://www.cut-the-knot.org/language/MathIsLanguage.shtml

Is the language (theorems, propositions, corollaries) already in the grammar (axioms) that specifies it? Well, in some sense it is, but on the other hand you do not care about every possible sentence that can be expressed in the language (leaving aside incompleteness results), but about those sentences or sets of sentences (theories) that are of interest for some practical or theoretical reason.

My opinion is that mathematics is discovered, but the way in which we put everything together and make it "understandable" is invented.

Today's quote naturally belongs to J. W. Gibbs: Mathematics is a language.

Monday, October 15, 2012

main problems III

Other interesting problems

There are other interesting problems that can be studied. In this post I introduce some of them, which I will hopefully try to address in future contributions.

One that I find very interesting, and that I call redundancy, can be stated as follows: for a given MGG (matrix graph grammar), decide whether there are redundant productions. A redundant production is one that can be written as a sequence of some of the other productions in the grammar. This is inspired by the notion of basis in vector spaces. In essence we are asking for minimal grammars that express some given language.
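To make the idea tangible, here is a toy sketch of mine (nothing like the actual MGG matrix machinery): a production is abstracted to the pair of element sets it deletes and adds, and redundancy is checked by brute-force search over short sequences of the other productions. Matching, node identification and all side effects are ignored.

```python
from itertools import product

def compose(seq):
    """Net effect of applying (deleted, added) pairs left to right,
    normalized so the two resulting sets are disjoint."""
    D, A = set(), set()
    for d, a in seq:
        A = (A - d) | a          # later deletions cancel earlier additions
        D |= d
    return (frozenset(D - A), frozenset(A))

def is_redundant(candidate, others, max_len=3):
    """True if some sequence over `others` (up to max_len productions)
    has the same net effect as `candidate` alone."""
    target = compose([candidate])
    return any(compose(seq) == target
               for n in range(1, max_len + 1)
               for seq in product(others, repeat=n))
```

For example, a production that deletes x and adds y is redundant in a grammar that already contains "delete x" and "add y". The real problem is of course much harder than this net-effect comparison.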

Liveness is a notion from Petri nets, and asks for the (potential) applicability of some production. In fact it can be used as a halting condition for matrix graph grammars, in particular when they are extended using affine productions (a post to come on this). Other concepts taken from Petri nets, such as boundedness, can also be of interest. Enough for today.

Today's quote is from Billy Connolly: I have been made redundant before and it is a terrible blow; redundant is a rotten word because it makes you think you are useless.

Saturday, September 29, 2012

main problems II

Main problems: termination, confluence and complexity

The main problems tackled in the MGG book are applicability, sequential independence and reachability. A lot is yet to be done. There are three more problems that are -- no doubt -- extremely interesting: termination, confluence and complexity.

Roughly, termination asks whether a given grammar finishes its execution. A complementary problem is confluence, which asks whether a terminating grammar has a unique final state. As you have probably noticed, these are the omnipresent (in mathematics) existence and uniqueness. They are not addressed in the MGG book, but there are some comments in the first and last chapters, mainly relating them to reachability and sequential independence.

In the first chapter of the MGG book I introduce a restricted version of confluence, named sequential confluence: the derivations used must be permutations of each other. Actually, the notion of confluence in the MGG book is not confluence as introduced above but one more closely related to independence and sequential independence (to ease the use of the theory developed in the book to study it).
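On small finite examples both notions can be explored by brute force. The sketch below is my own illustration (unrelated to the book's characterizations): a rule (needed, deleted, added) applies when its needed elements are present, and we enumerate every maximal derivation.

```python
def normal_forms(state, rules, depth=10):
    """All final states reachable from `state`. A rule (needed, deleted,
    added) applies when `needed` is contained in the state. Raises when
    `depth` is exhausted, hinting at non-termination. Brute force only."""
    state = frozenset(state)
    if depth == 0:
        raise RuntimeError("depth exceeded: the grammar may not terminate")
    nexts = {frozenset((state - d) | a)
             for (n, d, a) in rules if n <= state}
    nexts.discard(state)          # ignore applications that change nothing
    if not nexts:
        return {state}            # no rule applies: a final state
    out = set()
    for s in nexts:
        out |= normal_forms(s, rules, depth - 1)
    return out
```

A grammar is terminating here if no derivation exceeds the depth bound, and confluent if the returned set is a singleton.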

The next natural question is, for a given initial state of a confluent MGG, how long does it take to reach its final state? This is complexity. Currently it is my main motivation.

Before getting to complexity we need to study MGG as a model of computation, along with its submodels. Among them I find most interesting the one that allows neither the deletion nor the addition of nodes.

In this post I touch on topics that will be addressed in future posts: models of computation. My intention is to stay at a conceptual level, ignoring the technical part if possible (even if not possible). I will revisit the main concepts that have been reviewed in previous posts and will generalize them, introducing new ones such as swaps.

Edsger Dijkstra is for me a source of inspiration: Probably I am very naive, but I also think I prefer to remain so, at least for the time being and perhaps for the rest of my life.

Sunday, September 9, 2012

main problems I

Main problems: Applicability, sequential independence and reachability

After several weeks of contributions to this blog I realize I have not touched on the driving forces that guide my research. Normally I have some question in my mind that I would like to solve. At times they are problems as general as those that I am about to comment on, and at times they are much simpler. For me, a general problem is e.g. sequential independence. A more concrete problem is e.g. how to associate a function to a production.

Both general and concrete problems are important. General problems set broad objectives, guide long-term research and give structure to my work. I think it is helpful to have a long-term objective and, more importantly, this objective should be difficult. However, in everyday research, one should have humbler objectives. Otherwise frustration might be just about to knock.

I would like to say that I find some concrete problems as relevant as general ones, and at times even more so. For example, the one that I mention above (how to associate a function to a production) seems of capital importance to me. It might open new research directions by bringing in new branches of mathematics or by fitting previously unrelated results into a coherent body of theorems and propositions.

In the MGG book I have addressed and characterized applicability, sequential independence and reachability, which are related to each other and, in my opinion, complementary.

Applicability. Let a sequence of productions from some fixed MGG and a simple digraph G be given. Is it possible to apply the sequence to G?

Note that applicability is the essence of graph dynamics, because a new graph is derived if the sequence can be applied. For a given grammar and an initial host graph, some sequences can be applied and others cannot. These define the languages associated with a graph grammar.

Two characterizations are given in the MGG book: one using coherence, the other using congruence and initial digraphs (compatibility is needed in both).
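As a toy set-based rendering of the problem (nothing like the book's boolean matrix characterizations), a production can be modeled with its left hand side L (elements that must exist), its nihilation part K (elements that must not exist) and its delete/add actions; a sequence is applicable iff every match succeeds along the way:

```python
def apply_sequence(G, seq):
    """Apply productions left to right. Each production is a dict with keys
    'L' (must exist), 'K' (must not exist), 'delete' and 'add'.
    Returns the derived graph, or None if the sequence is not applicable."""
    G = set(G)
    for p in seq:
        if not (p['L'] <= G and p['K'].isdisjoint(G)):
            return None          # match fails: the sequence is not applicable
        G = (G - p['delete']) | p['add']
    return G
```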

Sequential independence. Do two derivations f and g = s(f) -- where s is a permutation -- that are applicable to the same initial graph G reach the same state H, i.e. f(G) = H = g(G)?

Sequential independence is a particular case of what I have called the independence problem, which states something similar but without one derivation being a permutation of the other. This problem is characterized in the MGG book (Sec. 6.2) using compatibility, coherence and congruence, with explicit formulas for advancing or delaying a production a finite number of positions inside a sequence.

Reachability. Let two graphs and a graph grammar be given. Does there exist a sequence made up of productions in the graph grammar that transforms the first graph into the second?

The problem is partially solved by extending techniques from Petri net theory to MGG. Along the way, Petri nets are characterized as a proper subset of MGGs and MGG techniques are applied to them. The relations among them are studied in the MGG book, though not in detail. Applicability, sequential independence and reachability organize the whole MGG book except Chap. 7 on graph constraints and application conditions (they are a generalization of productions and are somewhat unrelated to these problems).
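In the same toy set-based spirit, reachability can at least be phrased as a brute-force search over derivations (the state-equation techniques of the book are far more informative, of course):

```python
from collections import deque

def reachable(G, H, rules, max_steps=10):
    """BFS over derivations: is H derivable from G using `rules`?
    A rule is (needed, deleted, added); this is only a brute-force sketch."""
    G, H = frozenset(G), frozenset(H)
    seen, frontier = {G}, deque([(G, 0)])
    while frontier:
        state, steps = frontier.popleft()
        if state == H:
            return True
        if steps == max_steps:
            continue
        for needed, deleted, added in rules:
            if needed <= state:
                nxt = frozenset((state - deleted) | added)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, steps + 1))
    return False
```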

In the next post I will write on three more problems that I find very interesting: confluence, termination and complexity. The following quote of Alan Turing seems to fit well with today's post: We can only see a short distance ahead, but we can see plenty there that needs to be done.

Wednesday, August 22, 2012

what are mathematics good for?

Convince me that I have to study maths, dad

Servando Arbolí sent me the drawing that you can find below. It was taken from here. The question is: Why should I learn maths? Or, otherwise stated, why is maths important? To tell you the truth, I have never thought about it. Fortunately nobody asked me before.
 
[image: ask_a_silly_question.jpg]

Before thinking, it is always a good idea to google for an answer: root of all science (sounds to me kinda like "root of all evil"), good for training your brain (like exercise for training your body; possibly other brain exercises are easier and more effective), etc. I have not found any convincing answer and I do not have a better one. Maths are good for me because I enjoy them; I have a good time reading and doing maths. One of my great experiences was reading Riemann's paper on the distribution of prime numbers. I suppose I have simply liked maths since I was a boy (possibly due to their exactness, at first glance at least) and I suppose I'll stop as soon as it is not fun anymore.

A related recurrent question is: why do all sciences seem to be based on maths? To me, the answer is that we humans use languages to communicate. Any coherent and exact-to-some-extent language lies in the realms of maths (is maths to me). Hence, ...

Despite my atheism, I shall quote Sir James Jeans: From the intrinsic evidence of his creation, the great architect of the universe now begins to appear as a pure mathematician.

Saturday, August 4, 2012

application conditions II

application conditions and sequences

When an application condition (AC) is defined, several properties may fail. Informally: there is at least one graph that satisfies the AC (the AC is consistent); the AC is not a fallacy, i.e. false in all possible scenarios (the AC is coherent); and the AC describes some simple digraph (the AC is compatible).

Coherence typically fails because the AC asks for some element to be present and not to be present in the host graph (the graph the production is going to be applied to). Non-compatibility deals with internal structural inconsistencies. Consistency is concerned with applicability. It is proved in Chap. 7 of the MGG book that consistency is equivalent to compatibility plus coherence.

It is possible to use two operators (closure and decomposition) to transform any application condition into an equivalent sequence or set of sequences. In my opinion this is the main (unexpected) result of Chap. 7. This theorem has important consequences:
  1. All the machinery developed for sequences can now be applied to graph constraints and ACs.
  2. Through sequential independence, preconditions can be transformed into equivalent postconditions and vice versa.
  3. It is another example of an operator equivalent to a (set of) sequence(s). Other examples are the marking operator, the matching operator and the operator dealing with epsilon-productions.
In fact more can be said: any precondition is equivalent to some postcondition and vice versa. This is a quite strong statement. It says that ACs are "delocalized" inside the production. Again, more can be said: application conditions can be moved from one production to another inside sequences. So this "delocalization" happens in sequences as well.
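A toy rendering of the "AC equals a sequence" idea (my own simplification): a precondition that demands some elements and forbids others behaves exactly like a do-nothing production prepended to the sequence, so checking the AC reduces to checking applicability of the extended sequence:

```python
def ac_as_production(required, forbidden):
    """Encode a precondition as a do-nothing production: it matches iff the
    AC holds, so 'p with AC' applies iff the sequence [check, p] applies."""
    return {'L': set(required), 'K': set(forbidden),
            'delete': set(), 'add': set()}

def applicable(G, seq):
    """True iff every production in `seq` matches along the way."""
    G = set(G)
    for p in seq:
        if not (p['L'] <= G and p['K'].isdisjoint(G)):
            return False
        G = (G - p['delete']) | p['add']
    return True
```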

From a more practical point of view, ACs can be used to allow MGG to handle multidigraphs with no major modification of the theory. This solves one of the main practical drawbacks of MGG (the other one being relabelling, which I will address in a future post). They can also be used to study other models of computation as particular cases of MGG, such as Turing machines and Boolean circuits.

I cite Aristotle today: The whole is more than the sum of its parts.

Friday, July 27, 2012

application conditions I

application conditions and graph constraints

Graph constraints and application conditions are restrictions set on the host graph and on the applicability of a production, respectively. Conceptually, what we try to address is the specification of properties. For example: is a (host) graph three-colorable? Is every node of type 1 linked to one of type 2? This topic is important both from the theoretical and the practical points of view. It is addressed in Chap. 7 of the MGG book in full detail.

If we try to describe properties of a single graph, they are known as graph constraints; if there is a production involved they are known as application conditions. Application conditions can be set on the left hand side of the production (precondition) or on the right hand side (postcondition), or on both.

The proposal in MGG generalizes any previous approach to the topic. In essence, a graph constraint consists of a diagram (made up of graphs and morphisms between them) and a logical formula (monadic second order, to be concrete). The diagram specifies the graphs that take part in the constraint and the formula states whether they should or should not be found in the host graph. Also, the graphs can be existentially or universally quantified. An example is "whenever graph A is found, there must exist a graph B linked to it". Of course, much more complicated combinations can be defined.
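For intuition, the earlier sample constraint "every node of type 1 is linked to one of type 2" can be checked directly on a small graph encoded as a node set, an edge set and a typing map (a hypothetical encoding of mine, not the diagram-plus-formula formalism):

```python
def satisfies(nodes, edges, types):
    """Check the sample constraint: every node of type 1 has an outgoing
    edge to some node of type 2. Edges are (source, target) pairs."""
    return all(
        any((n, m) in edges and types[m] == 2 for m in nodes)
        for n in nodes if types[n] == 1
    )
```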

Now notice that the left hand side L of a production is in fact a graph constraint or an application condition: it asks for the existence of L in the host graph (the graph the production is going to be applied to). In fact, in MGG, application conditions are a particular case of graph constraints. Notice that in order to apply a production we not only ask for the existence of the LHS (left hand side, L) but also for the non-existence of the nihilation matrix K (see this post). So in fact we have a positive and a negative application condition.

The first thing we have to prove is that MGG is general enough to handle positive and negative application conditions as well as existential and universal quantifiers. Notice that the existence of the LHS of a production is an existential quantifier, but universals are not considered, at least explicitly. Interestingly, Sec. 7.2 proves that it is not necessary to extend the theory in order to cope with application conditions.

I interpret the following quotation from Plato as saying that we have to be very careful when setting the initial framework: The beginning is the most important part of the work. In some sense this was the initial reason for this blog.

Wednesday, July 11, 2012

spreading mathematics

accessing mathematics and headaches

Studying mathematics and getting a headache need not be cause and effect. However, many times in seminars, or when reading some papers, one gets the impression of being a little dumb. Well, I'm pretty sure you know those seminars in which everybody's constantly nodding, or papers with "obvious" steps (once written down they take three sheets, but they are obvious).

At times I think that we need to be admired for something, and in mathematics this "something" is intelligence. The fewer people who understand something, the more difficult it is, and the more intelligent the writer is. Something similar applies to speeches. The strange thing is that if you do not understand something in a seminar, you just keep quiet and spend (lose) one or two hours waiting for the storm to pass you by. Well, one should stand up and quietly leave the room.

Our life is short. I should be able to quickly decide if something will be of interest to me. If so, then I can dedicate time to it. For established theories there are plenty of tutorials on the web (God bless the web), but not so for recent research papers or new theories. After all, it is not that difficult to include some examples to illustrate the concepts. Even in papers submitted for publication.

I agree that mathematics is not easy, but most times we make it even more difficult. I think that mathematicians should be aware of the necessity of selling the product (I should include myself). The Bourbaki school, which had a big influence on the twentieth century, belongs to the past. We should react and go to the opposite extreme -- as is always the case: thesis, antithesis, synthesis -- and try to make even state-of-the-art papers accessible. Good ideas are not difficult.

Another point is that technicalities prevail over conceptual research. In a "deep paper" one expects hard-to-follow equations just above obscure paragraphs. With minor exceptions, new approaches and concepts are not normally recognized, though they are usually more influential, as they allow us to think from a different perspective.

I confess I'm not a fan of B. Mandelbrot at all, but I do acknowledge the importance of his work. With today's quote I agree, first, because it seems to support the thesis of this post and, second, because for me mathematics is nothing more (and nothing less) than a language: being a language, mathematics may be used not only to inform but also, among other things, to seduce.

Saturday, June 23, 2012

congruence

congruence

Congruence or G-congruence (graph congruence) checks whether two sequences (one a permutation of the other) or derivations have the same initial digraphs. It is studied in detail in Sec. 6.1 in the MGG book.

Congruence is used to characterize sequential independence: If the sequence and its permutation are both compatible, coherent and congruent, then they are sequentially independent, i.e. they can be applied to the same initial graph and will eventually derive the same image graph.

Sequential independence means that the ordering of the productions (at least the two orderings specified by these sequences) affects neither the initial nor the final state.

In the MGG book there are some formulas that guarantee congruence in the case of a single production being advanced a finite number of positions inside a sequence. They are known as congruence conditions. They can be calculated at design time, while the grammar is being defined, i.e. they are independent to some extent of the initial graph the sequence is going to be applied to.

Albert Einstein: Do not worry about your difficulties in Mathematics. I can assure you mine are still greater.

Friday, June 8, 2012

composition and compatibility

composition and compatibility of sequences

So far I have introduced sequentialization (the productions inside a sequence are applied one after the other). Today I will scratch the surface of parallelism, but first we need to introduce the composition of productions.

Consider a sequence. Its composition is just the composition of its productions. Thus, we obtain a single production that performs the same actions as the whole sequence. The natural question that arises with composition is: what do we need sequences and concatenation for? There are some differences that make composition and sequentialization pretty different concepts.

The first is that of intermediate states: sequences of productions generate intermediate states while their composition does not (it is a single production). Not only that, but the final state may differ in both cases. Think of a sequence that first deletes and then adds one node. If the node had any dangling edges, they are deleted by some epsilon-production. However, the composition does not touch the node, so it does not delete those edges.
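The difference can be made concrete with a floating-style application that drops incident edges when a node disappears (playing the role of the epsilon-production). This is a sketch under my own simplified node-and-edge-set encoding, not the book's construction:

```python
def apply_floating(nodes, edges, p):
    """SPO-style application: deleting a node also deletes its incident
    edges (the job of an epsilon-production in MGG)."""
    nodes = (nodes - p['del_nodes']) | p['add_nodes']
    edges = {(u, v) for (u, v) in edges if u in nodes and v in nodes}
    return nodes, edges

p_del = {'del_nodes': {'n'}, 'add_nodes': set()}   # delete node n
p_add = {'del_nodes': set(), 'add_nodes': {'n'}}   # add node n back

G_nodes, G_edges = {'n', 'm'}, {('m', 'n')}

# sequence p_del ; p_add: the edge ('m', 'n') dangles and is removed
n1, e1 = apply_floating(*apply_floating(G_nodes, G_edges, p_del), p_add)

# composition: the net effect only ensures n is present, deleting nothing
comp = {'del_nodes': set(), 'add_nodes': {'n'}}
n2, e2 = apply_floating(G_nodes, G_edges, comp)
```

Applying the sequence kills the edge, while applying the composition keeps it: same node set, different final graphs.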

If two productions are sequentially independent, the order of their application does not matter, so they can be executed in parallel. But in the MGG book, Chap. 6, it is proved that advancing (or delaying) one production two positions inside a sequence in a single step is weaker (less demanding) than advancing one position and then another.


In general, if two sequences are going to be applied in parallel, then every production in the first sequence needs to be sequentially independent with respect to all productions in the second. Previous comments weaken this, so it is possible to execute independent subsequences in parallel. See Sec. 5.4 of the MGG book.

Today's post ends quoting Edsger Dijkstra: Elegance is not a dispensable luxury but a quality that decides between success and failure.

Sunday, May 27, 2012

sequentialization and sequential independence

sequentialization and sequential independence

Assume we are given a sequence of productions. Recall that a sequence is just an ordered (finite) set of productions. A natural question is whether some production can be applied before (advanced) or after (delayed) its current position without altering the result of the sequence. In essence, this is sequential independence.
Suppose that our sequence is made up of two productions p2;p1 (by convention, p1 is applied before p2). In this case, if p2 can be advanced -- or equivalently p1 can be delayed -- and the new sequence p1;p2 has the same output as the original, then we say that p2 is sequentially independent of p1 (and vice versa).

Sequential independence is an important concept, both from the theoretical and the practical points of view. Theoretically, it extends commutativity to an arbitrary (though finite) number of elements. Practically, it is closely related to the parallel execution of tasks (a future post on this).

Chapter 6 of the MGG book addresses independence for sequences and for derivations (recall that a derivation is a sequence applied to an initial graph). Sufficient conditions for moving productions are stated and proved. In the literature, in order to test sequential independence for jumps larger than a single position, the production is checked to be sequentially independent with respect to every single production in the jump. One interesting result proved in the MGG book is that this is much more restrictive than what a direct jump actually needs.

Sequential independence can be characterized in terms of coherence (see this post or Chap. 4 in the MGG book) and congruence (I will write a post on it; it is addressed in Sec. 6.1 of the MGG book).

Today I quote Henri Poincaré: to doubt everything or to believe everything are two equally convenient solutions; both dispense with the necessity of reflection.

Sunday, May 13, 2012

publishing

My personal opinion on publishing

Today's post deviates from previous ones. I will tell you my opinion about publishing. I am not a fan of the standard way: submit, go through a revision-correction process, (eventually) publish.

There are a few problems I see in the process: it is much easier if you have a well-known surname "in the business" (or, more generally, how objective is the evaluation process?), it takes quite a long time (one and a half years is not uncommon) and, most important, what is the real impact? I mean, nowadays most of us use web engines to search for topics and most research is publicly available on the web. There are other problems that I see but keep to myself (let's be polite).

But my main objection is that once you get through and eventually the paper is published in some journal, potential readers usually have to pay.

Of course, there are good by-products even if the paper is not eventually published. For example, the feedback from the reviewers is normally very valuable. Besides, the author is usually more careful, and the paper is revised and well thought out before submission.

I absolutely support initiatives like the Electronic Journal of Combinatorics and arXiv. Needless to say, everything can be improved, but in my opinion these are two examples of how scientific communication will be driven in the future. In fact, I have submitted a paper to the Electronic Journal of Combinatorics and got it published.

There are two very interesting critiques that I recommend to you:
  1. The first, written by David Lorge Parnas with the title Stop the Numbers Game, revises the big negative impact on current research of the widespread policy of measuring researchers by the number of papers they publish. In fact, the critique goes a bit further. Thanks to Juan de Lara for letting me know.
  2. The second, signed by 10 computer scientists (Oded Goldreich among them), with the title On evaluating conceptual contributions, is an attempt to draw the community's attention to this problem: conceptual contributions are considered far less relevant than technical ones. I see mathematics as a language, with concepts being the building blocks of the skyscraper, so you can figure out what I think of this.
I end today quoting Henri Poincaré: It is by logic that we prove, but by intuition that we discover. To know how to criticize is good, to know how to create is better.

Saturday, April 28, 2012

initial graphs

initial graphs (minimal and negative initial digraphs)

One of the main advantages of MGG over other approaches to graph transformation is its ability to study properties of the grammar independently of the initial state. Thus, it is possible to define an algorithm (the productions of a grammar) and analyse some of its properties without taking into account the concrete problem it is going to be applied to.

For a given sequence, one natural question asks for the minimal (smallest) graph that we have to be given in order to apply the sequence. Recall that MGG is able to study not only the elements that have to be present in order to apply a production, but also those that must not be in the initial graph. The minimal initial digraph (MID) is precisely this smallest graph. The negative initial digraph (NID) specifies those elements that should not appear in the host graph. They can both be explicitly calculated. Sections 4.4 and 5.3 in the MGG book are devoted to this topic. It is interesting that we can use these concepts to contribute to Petri nets (Sec. 8.2).
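The flavour of the MID computation can be sketched in a toy set encoding (the book does this with explicit boolean matrix formulas, and handles the NID analogously): walk the sequence, and whatever a production needs that earlier productions have not provided must come from the initial graph.

```python
def minimal_initial(seq):
    """Smallest element set needed to fire the sequence left to right.
    Each production is (needed, deleted, added). Sketch of the MID idea,
    ignoring matching and node identification."""
    have, need = set(), set()
    for needed, deleted, added in seq:
        missing = needed - have   # not provided by earlier productions
        need |= missing
        have |= missing           # assume they were present initially
        have = (have - deleted) | added
    return need
```

Note how elements produced by earlier productions (like 'b' below) do not enter the MID.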

The main importance of the initial digraphs is that they substitute for the host graph in those cases where some initial state is mandatory, for example when dealing with universal quantifiers in application conditions. Hence, the analysis of the grammar productions can be performed without the need for a concrete host graph.

In previous posts I have commented on non-determinism in MGG, one of its sources being how nodes across productions are related (whether two nodes of the same type in different productions have to be the same node or not). If we have a concrete sequence in mind, as soon as the left hand sides of the productions are matched in the state graph the non-determinism disappears. But the interesting part of the analysis is the one that happens before the productions are matched into the host graph. Recall that this non-determinism is classified as vertical and horizontal. The initial digraphs deal with horizontal non-determinism, so they also contribute to the study of vertical non-determinism.

Depending on the way completion is carried out, different initial digraphs will be derived, giving rise to the notion of the initial digraph set. Horizontal non-determinism needs further research. For some time I tried to address this issue using representation theory, which I will comment on in a future post. There is a quick introduction to representation theory here.

Summarizing, initial digraphs are interesting at least for three reasons: They keep independence in MGG of the analysis with respect to the host graph, they also help in the study of horizontal non-determinism and they answer the question of what is the smallest graph able to fire a sequence.

Today I find myself a bit sarcastic. Normally I choose quotes that are related to the topic of the post in some way, but today I cannot resist including the following phrase from David Hilbert: Some people have got a mental horizon of radius zero and call it their point of view.

Tuesday, April 10, 2012

marking

marking places in the host graph

Non-determinism in MGG is a recurrent topic in this blog. It is a property that MGG shares with the rest of the approaches to graph rewriting. However, there are several situations in which determinism is needed, even if we want to have a non-deterministic grammar.

The need for determinism arises naturally in MGG in the following three sample situations: dangling edges, application conditions and splitting rule actions. The first I briefly explained in a previous post: in order to delete any possible edge we append an epsilon-production. Of course, we must ensure that both productions are applied to the same nodes (otherwise, dangling edges would not be removed).

I will touch on application conditions in some future post. Application conditions are just what their name indicates: properties that must be fulfilled by the host graph in order to apply the production. It should suffice to say that in MGG it is possible to calculate a sequence equivalent to a given production plus application conditions. This sequence extends the production, and again it is necessary to guarantee that all relevant productions will be applied to the same nodes. If you are interested in application conditions, they are addressed in Chap. 7 of the MGG book.

At times it is of interest to split a production into subproductions whose overall effect is equivalent. A good example is reachability (some posts on it in the future). If productions of a certain type are split into two productions (one with the addition of elements and one with the deletion of elements), then the state equation can gather more information. Clearly, both productions should be applied to the same elements. For reachability, refer to Chap. 8 of the MGG book.

How is this achieved in MGG? The idea is quite simple but has some subtleties that I do not address here (see Sec. 5.2 of the MGG book): just mark all nodes the first production is applied to. To this end, introduce a node of a new type (a type not present in the host graph nor in the LHS or RHS of the productions). Link this node to all nodes where the first production is applied. Alter the second production so that, first, its LHS must find this element and its relevant incident edges and, second, it deletes this marking node (not its incident edges). Why the second grammar rule should not delete the incident edges is related to sequential independence and is explained in detail in the MGG book.
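A toy version of the trick under my usual simplified encoding (real MGG works with typed nodes and boolean matrices): the first production leaves a marker node of a fresh type linked to the nodes it matched, and the second production recovers exactly those nodes before deleting the marker. For simplicity this sketch also drops the marker's edges, which the actual construction leaves to an epsilon-production for the reasons just mentioned.

```python
def mark(nodes, edges, matched, marker='__mark__'):
    """First production's side effect: add a fresh marker node linked to
    every matched node (the marker type is assumed absent from the host)."""
    assert marker not in nodes
    return nodes | {marker}, edges | {(marker, n) for n in matched}

def unmark_match(nodes, edges, marker='__mark__'):
    """Second production: find the marker, recover the pinned nodes,
    then delete the marker (and, only in this sketch, its edges)."""
    pinned = {n for (m, n) in edges if m == marker}
    nodes = nodes - {marker}
    edges = {(u, v) for (u, v) in edges if marker not in (u, v)}
    return pinned, nodes, edges
```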

Today's quote I have chosen from the great Friedrich Nietzsche: there are no facts, only interpretations. I, as a mathematician, completely agree.

Monday, March 26, 2012

me and the music

A music group called inerKor

Once upon a long ago, two friends of mine (Fernando, aka tibu, and Javi, aka mono) started playing the guitar. They decided to form a group. Soon after, I joined them as the drummer. Inerkor was born. We tried to include a bassist but after many trials it was not possible. One of our best friends -- Alfonso -- played for a while but unfortunately he left. So eventually inerkor was a group of three, without bass (who needs it with a double bass drum?).

Our style was influenced by groups like Pantera, Metallica, Napalm Death and many others. Truth is that we spent a lot of time listening to other groups and going to concerts. So you can imagine that our music was pretty noisy, but we introduced certain melodic parts (even with those broken guitars and the double bass drum going full pelt). Really cool.

Our lyrics would have been under a parental advisory label. No doubt. Titles like "suicide time" or "I'd enjoy killing you, motherfucker" say it all. The message was pretty pessimistic, but in line with our feelings and the facts of the society we were facing. Even though we really enjoyed ourselves, I think anger prevailed in our concerts, music and lyrics. I am not sure anger against what. Looking back, I realize that this anger has not faded away, and neither has that friendship that seems to last forever.

We played for over seven years, giving concerts here and there (one of my dearest memories is our concert in Firenze, Italy), recorded one demo that unfortunately I no longer keep (otherwise you could download it from this webpage, be sure) and quit at our best moment, I think. I mean that we were good enough to play alongside almost any group and our music was not bad (well, what else could I say?).

A few weeks ago, surfing the web, by chance (by mistake: I typed the name of the group into the Google bar instead of into my Gmail account) I discovered a video in which the lead guitarist, tibu, was playing kind of celtic music in a collaboration with other people. I had not seen him for over eight years and it was very, very moving. The video is this one.



Many, many things changed in my life at the age of 27/28. Another day, another post, I will tell you about other deep changes. It seems that I got sentimental. Okay, we will return to maths and MGG in the next post. Reading this post again, it does not do Inerkor or my friends justice, so this quote from Gustave Flaubert seems appropriate: writing history is like drinking an ocean and pissing a cupful.

miércoles, 14 de marzo de 2012

derivations III

derivations and direct derivations. Part III (epsilon-productions)

Today I touch on dangling edges. Let p be a production that specifies the deletion of one node (the LHS -- left hand side -- is made up of a single node and its RHS is empty). Suppose the production is applied to a graph with no isolated nodes. What happens with the edges that are incident to the node that p deletes?

There are two possibilities: either the production cannot be applied, or every edge incident to the node that is going to be deleted is deleted as well. Let us call such grammars fixed grammars and floating grammars, respectively.

In the categorical approach, fixed grammars correspond to the DPO approach and floating grammars to the SPO approach. This is precisely the main difference between them: the pushout construction takes care of dangling edges in the SPO approach, while the double pushout construction (DPO approach) does not allow the application of a production if any edge would become a dangling edge.

MGG is a floating grammar (this topic is addressed in detail in Sec. 5 of the MGG book). This is achieved using one operator, T. The basic idea is to enlarge the production p so that it includes (and deletes) any potential dangling edge. As has always been the case so far in MGG, an operator turns out to be equivalent to a certain sequence of productions. In the case of dangling edges, T(p) is equivalent to a sequence of two productions p;pe, where pe (epsilon-production or e-production) deletes any potential dangling edge (pe is applied before p). If a fixed grammar is preferred, it is enough to make T the identity operator (compatibility is an application condition that MGG has for free).
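A minimal sketch of this idea for a node-deleting production, with graphs as edge sets over named nodes (the function names and data model are mine, for illustration; the real T operator works on the production's matrices):

```python
# T(p) ~ p ; pe for a production p that deletes a single node.

def epsilon_production(edges, node):
    """pe: delete every edge incident to `node`, i.e. the potential dangling edges."""
    dangling = {(u, v) for (u, v) in edges if node in (u, v)}
    return edges - dangling

def apply_T(nodes, edges, node):
    """Apply pe first (remove would-be dangling edges), then p (delete the node)."""
    return nodes - {node}, epsilon_production(edges, node)
```

With T as above the grammar floats; replacing apply_T by a check that refuses to act when `epsilon_production` would remove anything gives the fixed behaviour.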

It remains to guarantee that p and pe are applied at the same place in the graph (recall non-determinism). To this end MGG uses marking, which is introduced in Sec. 5.2 of the MGG book.

Today I quote Thales. In my opinion, honesty is a must: I will be sufficiently rewarded if when telling it to others you will not claim the discovery as your own, but will say it was mine.

domingo, 26 de febrero de 2012

derivations II

derivations and direct derivations. Part II (non-determinism)

In a previous post I introduced derivations in MGG. Today I want to give you my personal interpretation of what a direct derivation is. Also, there is one problem that needs to be addressed in order to move forward: Potential dangling edges. MGG takes care of them through so-called epsilon-productions (e-productions). I will address them in the next post.

One question that I find pretty natural is why applying a production to some host graph is so complicated. There are several steps involved. For example, in the categorical approaches (SPO/DPO) we use one pushout construction (or even "worse", two pushout constructions) to define what a direct derivation is. My personal view is that the pushout construction is a device to transform a production into a function. This is not the standard point of view, in which the grammar is/specifies the function.

However, I think that the underlying idea of a production should be that of a function, which is straightforward. Direct derivations as introduced so far in the literature are not (they are neither straightforward nor functions). Recall that a production maps a single graph into another graph. I think it should rather be a recipe to transform graphs into graphs (production and grammar rule are synonymous; the terms rule and recipe are not that far apart, in my opinion).

Nonetheless, my main objection to the pushout construction is its inherent non-determinism. In a more functional style I would say that it defines a multivalued function (one that assigns several different outputs to a single input). This is a far-reaching subject. For example, if SPO were used as a model of computation, e.g. to study the P vs NP problem, we would not get too far. Unfortunately, I think we would not even get started, because the complexity class P could not be defined. By the way, I am not aware of any attempt to use or study DPO or SPO as a model of computation, even though most of the research in this field is carried out by computer scientists. No doubt, this is an extremely interesting topic.

How can we avoid this non-determinism in the different approaches to graph transformation? In my personal opinion, we cannot, or at least I do not see how (I mean an easy way). How can we avoid it in MGG? Recall that in MGG we can split the dynamics of the production from its statics, so to speak. One possibility is the so-called swaps, which are introduced in this preliminary paper (the full version has been published in the Electronic Journal of Combinatorics).

The obvious drawback of swaps is precisely that we lose non-determinism. Actually, it can be recovered but we shall leave this to a post dedicated solely to swaps.

With today's quote (Samuel Johnson) I do not agree. In fact, I think it points at what is probably the main problem in mathematics, and this blog's very initial reason for being: Sir, I have found you an argument. I am not obliged to find you an understanding. It is also possible to illustrate one's opinion using counterexamples...

viernes, 17 de febrero de 2012

derivations I

derivations and direct derivations. Part I

Recall that a production (or grammar rule) specifies how one graph is transformed into another. Following the intuition behind automata or Turing machines, we want to establish a procedure to pass from one state of a predetermined system to another, hence simulating its behaviour. In graph grammars, the state of a system is fixed by an associated graph. The idea is very simple: apply the production to the initial state of the system and derive a new (system) state. This is precisely a direct derivation: the application of some production to some graph. A derivation is just a sequence of direct derivations.

The question that naturally arises is how this application is performed. The answer depends on the chosen approach to graph transformation. For example, in DPO the construction consists in essence of two pushout squares (hence the name double pushout), while in SPO it is a single pushout construction. Whatever the chosen approach, the following steps are always carried out:
   1. Select grammar rule.
   2. Find occurrence of the grammar rule's left hand side (LHS) in the host (initial) graph.
   3. Check application conditions.
   4. Remove elements that appear in the left but not in the right hand side (RHS).
   5. Glue the elements of the right hand side to the graph of previous step.
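The steps above can be sketched on edge sets, assuming the match is simply the identity on node names (a toy stand-in for real matching, which is step 2 and is far richer; the function name is mine):

```python
# Steps 2-5 of a direct derivation, with graphs, LHS and RHS as sets of
# directed edges over shared node names (a deliberate oversimplification).

def direct_derivation(host, lhs, rhs):
    # 2. find an occurrence of the LHS in the host graph (identity match)
    if not lhs <= host:
        return None                    # no occurrence: production not applicable
    deleted = lhs - rhs                # 4. elements in the LHS but not the RHS
    added = rhs - lhs                  # 5. elements of the RHS to glue in
    return (host - deleted) | added
```

Step 3 (application conditions) is omitted here, and step 4 as written can leave dangling edges when nodes are involved, which is exactly what the next post addresses.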

MGG also follows more or less these same steps. Let's briefly comment on them. Regarding the selection of rules, there is no standard way (notice that a sequence precisely specifies this order of application). This is a clear source of non-determinism, and we shall come back to it when we comment on MGG as a model of computation, sometime in the future.

The occurrence of the LHS is carried out through matching. This is addressed in Chap. 5 in the MGG book. There are two equivalent proposals, one using category theory (similar to the categorical approaches) and the other (novel) defining an operator that enlarges the LHS of the production. The basic remark is that a match is nothing but a production that transforms the LHS into the host graph. Then, we can define an operator that basically performs a continuation of the production (enlarges the production) such that it is perfectly suited for the graph it is going to be applied to.

I will write some posts on application conditions and graph constraints, so I will not touch step 3 here. Steps 4 and 5 are easily addressed through Boolean operations, as explained in this post. As nodes can be deleted, step 4 is potentially unsafe because some edges may dangle. Next post will be on dangling edges and how they are addressed in MGG.

As stated in previous posts, the way MGG handles productions and direct derivations has several advantages, and no real drawback as far as I can see. Please comment if you have a different opinion.

Today I quote Oliver Heaviside. Should I refuse a good dinner simply because I do not understand the process of digestion?

sábado, 28 de enero de 2012

sequences and coherence

A sequence in MGG (aka concatenation) is just an ordered set of productions (aka grammar rules). In mathematics one usually has countably many ordered elements and studies the limit. However, when studying grammars and languages one usually has a finite number of grammar rules. (The language generated by a grammar is, more or less, its set of finite sequences.)

I like to think that the importance of sequences stems from the fact that they remove the non-determinism of "the next rule to apply" -- a complexity point of view. Recall that there is another source of non-determinism in MGG: the place of the host graph in which the production is applied.

There are some novel concepts related to sequences introduced in the MGG book: Compatibility, coherence, initial graphs and congruence. Today I touch on coherence and possibly the next few posts will brush over the rest.

Productions inside a sequence are applied in order, so it can be the case that one production performs an action that interferes with some production that has to be applied later. Coherence guarantees that this will not happen. Notice how close coherence is to applicability (the possibility of applying the sequence to some initial graph, thus deriving a new graph: the output of the sequence). In fact, it turns out that a sequence can be applied if and only if it is compatible and coherent.
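A naive version of such a check can be sketched as follows, with each production reduced to a pair (lhs, rhs) of element sets and the sequence applied left to right. This is my simplification for illustration, not the full coherence condition of the MGG book:

```python
# Naive coherence check: no production may need an element that an earlier
# production in the sequence deleted without it having been re-added.

def coherent(sequence):
    deleted = set()                    # elements destroyed so far
    for lhs, rhs in sequence:
        if lhs & deleted:
            return False               # a needed element was destroyed earlier
        deleted -= rhs                 # re-added elements are available again
        deleted |= lhs - rhs           # elements this production deletes
    return True
```

For example, a rule that deletes an edge followed by a rule that needs that edge is not coherent, unless some rule in between re-adds it.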

Today's quote is from John von Neumann, and I would like to dedicate it to my friend Álvaro Iglesias: In mathematics you don't understand things. You just get used to them.

miércoles, 11 de enero de 2012

the very basics III

The very basics of graph dynamics: the nihilation matrix.

Today I touch on the nihilation or nihil matrix. Recall that a production transforms one graph (L) into another (R). When a production is applied to some host graph G (which represents the state of the system under study), L is found in G and substituted by R to derive graph H (which is the new state of the system).

Notice that the left hand side of the productions (L) specifies what elements must be present in any host graph in order to apply the production. The nihilation matrix specifies what edges must not be present in order to apply the production. It is represented by K. There are two sets of edges that should not appear: Those incident to nodes that are deleted by the production and those that are going to be added by the production.
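Those two edge sets suggest a simplified construction of K over Boolean adjacency matrices. This is my own reading for illustration (see Sec. 4.4 of the MGG book for the actual definition); the function name is hypothetical:

```python
import numpy as np

# Simplified nihilation matrix: forbidden edges are those the production adds
# plus any edge incident to a node it deletes, except the edges L itself uses.

def nihilation(L, R, deleted_nodes):
    r = R & ~L                          # edges the production adds
    inc = np.zeros_like(L)              # edges incident to deleted nodes
    for v in deleted_nodes:
        inc[v, :] = True
        inc[:, v] = True
    return (r | inc) & ~L
```

Subtracting L at the end reflects that the edges the production explicitly deletes are required (they are in L), not forbidden.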

It is important to notice that the nihilation matrix only makes explicit some implicit information. Then, what is it good for? Actually it is extremely useful. First, it can be used to characterize compatibility (closedness of the set of graphs with respect to the productions of the grammar). Second, together with L and R, it defines the natural framework to study application conditions and graph constraints. Most importantly, nihilation matrices are the key idea relating MGG to complex analysis, swaps, etcetera (an introduction here).


I have said that the nihilation matrix just makes explicit some implicit information. We know that a production p transforms L into R, so in principle we have all the ingredients to know what the forbidden elements are in the image of the host graph G. Interestingly, the image of the nihil matrix K (which I will represent as Q) evolves according to the inverse of the production:

(L, K) \longmapsto (R, Q) = (p(L), p^{-1}(K))

More on nihilation matrices can be found in Secs. 4.4 and 7.4 of the MGG book. Let's finish today with a nice quote from Carl Friedrich Gauss: It is not knowledge, but the act of learning, not possession but the act of getting there, which grants the greatest enjoyment.