Wednesday, December 3, 2008

Week 12

Main topic of week 12: pumping lemma, context-free grammars

Since not all languages are regular (for example, the set of strings with the same number of 1s and 0s), the idea of the pumping lemma is to let us prove that a given language is not regular. The main scheme of the pumping lemma is this: if a DFSA denoting L has p states, then every accepted string of length at least p can be written as uvw, where v is not empty, such that uv^kw is also accepted for every k >= 0. The idea behind this is that somewhere in the string the machine must loop back to a state it has already visited, since the string is longer than the total number of states. So we can repeat that cycle k times, and the string uv^kw will still be accepted by L.
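The repeated-state argument can be demonstrated concretely. Below is a small sketch (the example DFSA and the helper names `delta`, `accepts`, `decompose` are mine, not from the lecture) that finds the u, v, w decomposition by locating the first repeated state, then checks that pumping v keeps the string accepted:

```python
# DFSA over {0, 1} accepting strings with an even number of 1s (2 states).
delta = {(0, '0'): 0, (0, '1'): 1, (1, '0'): 1, (1, '1'): 0}
start, finals = 0, {0}

def accepts(s):
    q = start
    for c in s:
        q = delta[(q, c)]
    return q in finals

def decompose(s):
    """Split s into u, v, w with v non-empty, using the first repeated state."""
    q, seen = start, {start: 0}
    for i, c in enumerate(s):
        q = delta[(q, c)]
        if q in seen:                      # pigeonhole: a state repeats
            j = seen[q]
            return s[:j], s[j:i + 1], s[i + 1:]
        seen[q] = i + 1
    return None

s = "1010"                                  # accepted, and longer than the 2 states
u, v, w = decompose(s)
# pumping v any number of times keeps the string in the language
assert all(accepts(u + v * k + w) for k in range(5))
```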

Tuesday, November 25, 2008

Week 11

Main topic of week 11: NFSA, equivalence of DFSA and NFSA, Cartesian product

The definition of an NFSA is very close to that of a DFSA. The only differences I see are that an NFSA allows empty (epsilon) transitions, and its transition function now sends a state to a set of states. In summary, a DFSA is just a special case of an NFSA in which empty transitions and multiple-state transitions are not allowed.

DFSA and NFSA are equivalent, and every regular expression can be expressed using a DFSA or an NFSA. To convert an NFSA back into a DFSA, we need some additional symbols: HatQ (all subsets of Q), HatSigma (the same as Sigma), HatDelta (a transition function collecting all the states that the current set of states can reach by a single transition), HatS (the set of starting states) and HatF (the subsets containing a final state).
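The subset construction can be sketched in a few lines. This is my own minimal version (epsilon transitions omitted for brevity, and the example NFSA, accepting strings ending in "01", is an assumption of mine): HatQ is built lazily as only the reachable subsets.

```python
from itertools import chain

# NFSA accepting strings ending in "01": states 0, 1, 2; 2 is final.
nfa_delta = {(0, '0'): {0, 1}, (0, '1'): {0}, (1, '1'): {2}}
nfa_start, nfa_finals = {0}, {2}

def move(states, c):
    """All states reachable from any state in `states` on symbol c."""
    return set(chain.from_iterable(nfa_delta.get((q, c), set()) for q in states))

def to_dfa(alphabet):
    start = frozenset(nfa_start)
    delta, todo, seen = {}, [start], {start}
    while todo:                            # explore only reachable subsets
        S = todo.pop()
        for c in alphabet:
            T = frozenset(move(S, c))
            delta[(S, c)] = T
            if T not in seen:
                seen.add(T)
                todo.append(T)
    finals = {S for S in seen if S & nfa_finals}   # HatF: subsets meeting F
    return start, delta, finals

start, delta, finals = to_dfa("01")
q = start
for c in "1001":                           # "1001" ends in "01"
    q = delta[(q, c)]
assert q in finals
```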

The Cartesian product is used when we need to design a machine with more than one independent requirement. For example, for strings with odd length and an odd number of 1s, we could use two DFSAs to keep track of the length of the string and the number of 1s separately. Or we could take the Cartesian product to produce a single machine with 4 states, keeping track of the length and the 1s at the same time.
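A sketch of that 4-state product machine (the variable names and accepting condition are my illustration): each component of the state pair is one of the two original machines.

```python
def accepts(s):
    length_parity, ones_parity = 0, 0          # the two component states
    for c in s:
        length_parity = 1 - length_parity      # machine 1: length mod 2
        if c == '1':
            ones_parity = 1 - ones_parity      # machine 2: number of 1s mod 2
    # product state is the pair; accept exactly when both parities are odd
    return (length_parity, ones_parity) == (1, 1)

assert accepts("100")       # length 3 (odd), one 1 (odd)
assert not accepts("11")    # length even
```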

At last we learnt how to convert an FSA back into a regular expression, by eliminating one state at a time.

Tuesday, November 18, 2008

Week 10

Main topic of week 10: equivalence of the regular expression and DFSA

An additional problem is proving that two regular expressions are equivalent, which is the same kind of task as proving the equivalence of a regular expression and a language.

A Deterministic Finite State Automaton is a powerful tool for examining whether a specific string satisfies a given formula. The general components of a DFSA are Q (states), Sigma (alphabet), delta (transition function), s (start state) and F (accepting states). The extended transition function delta* takes a whole string to a state, rather than a single character. A DFSA may contain a state from which no accepting state can be reached. Such a state is called a dead end, and it can be left out when drawing a DFSA diagram.
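The extended transition function delta* is just delta applied character by character. Here is a sketch over a small example machine (the divisibility-by-3 DFSA and all the names are mine, not from the lecture):

```python
# DFSA accepting binary strings whose value is divisible by 3;
# the state is the remainder mod 3, so Q = {0, 1, 2}, s = 0, F = {0}.
delta = {(q, c): (2 * q + int(c)) % 3 for q in range(3) for c in "01"}
s, F = 0, {0}

def delta_star(q, w):
    """Extended transition function: fold delta over the whole string."""
    for c in w:
        q = delta[(q, c)]
    return q

def accepts(w):
    return delta_star(s, w) in F

assert accepts("110")        # 6 is divisible by 3
assert not accepts("101")    # 5 is not
```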

Tuesday, November 11, 2008

Week 9

Main topic of week 9: languages and regular expressions

Brand new material starts here and, of course, there are a lot of definitions to remember this week. These include strings, alphabets, length, and languages. A language is a set of strings over some alphabet. It supports operations such as intersection, union, Kleene star and complement.

Regular expressions are used to denote languages with a special structure, and they are also defined recursively. The base cases are a single character, the empty string and the empty set; and if R and S are regular expressions, then so are R+S, RS and R*. The languages denoted by regular expressions are defined in the same fashion: the base cases are the languages denoted by the empty string, a single character and the empty language; and if two languages are each denoted by some regular expression, then their union, concatenation and Kleene star are also denoted by regular expressions.
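A matcher that recurses on exactly those cases makes the definition concrete. In this sketch (the tuple representation and the function are mine) a regex is a character, "" for the empty string, None for the empty set, or a tuple `('+', R, S)`, `('.', R, S)` or `('*', R)`:

```python
def matches(r, s):
    if r is None:                    # base case: empty language
        return False
    if isinstance(r, str):           # base cases: "" (eps) or one character
        return s == r
    op = r[0]
    if op == '+':                    # union: R + S
        return matches(r[1], s) or matches(r[2], s)
    if op == '.':                    # concatenation RS: try every split point
        return any(matches(r[1], s[:i]) and matches(r[2], s[i:])
                   for i in range(len(s) + 1))
    if op == '*':                    # Kleene star: eps, or one R then more R*
        if s == "":
            return True
        return any(matches(r[1], s[:i]) and matches(r, s[i:])
                   for i in range(1, len(s) + 1))

# (0 + 1)* 1 : binary strings ending in 1
r = ('.', ('*', ('+', '0', '1')), '1')
assert matches(r, "011")
assert not matches(r, "10")
```

(The star case only recurses after consuming at least one character, so the recursion terminates.)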

More examples of regular expressions were provided here; with a little algebraic manipulation we can easily show that some regular expressions are the same.

To prove that the language denoted by a regular expression is equal to another language A, we prove mutual inclusion: the language of the expression is a subset of A, and A is a subset of the language of the expression.

Tuesday, November 4, 2008

Week 8

Main topic of week 8: naming repetition and proofs about iteration

As discussed previously, recursion is an extension of the process of naming repetition. A recursive function is formed by providing a base case and a generating body. Iteration works in a similar way, but with a little variation. Unlike a recursive function, an iteration must have an end point somewhere, or it produces an infinite loop and yields a false result. Thus when proving the correctness of an iterative function, we first need to find a loop invariant that leads toward the final result, and a termination argument that guarantees the function will end. The termination argument is sometimes not easy to find, and definitely requires some thought.
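A small iterative example with its invariant and termination measure written out as comments (the function and the invariant are my own illustration, not from the course):

```python
def iter_sum(a):
    """Return a[0] + ... + a[len(a) - 1]."""
    i, total = 0, 0
    # Loop invariant: total == sum(a[:i]) and 0 <= i <= len(a)
    while i < len(a):
        total += a[i]
        i += 1
        # Invariant restored: total == sum(a[:i])
    # Termination: len(a) - i is a natural number that strictly decreases
    # on each pass, so the loop cannot run forever.
    # At exit: i == len(a), so the invariant gives total == sum(a).
    return total

assert iter_sum([1, 2, 3, 4]) == 10
```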

Monday, October 27, 2008

Week 7

Main topic of week 7: definition of function correctness

A function is correct if, whenever its precondition holds, its postcondition holds after the function executes. So, to prove a function correct, we generally assume the precondition is true and use induction to examine what happens at each step. We then generalize all the results and compare them with the postcondition. While proving it, we usually divide the proof into different cases, and from them we derive the postcondition.
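As a concrete sketch of pre/postcondition reasoning (the example is mine; the course's functions may differ), here is division by repeated subtraction with its specification written in the docstring:

```python
def divide(m, n):
    """Precondition: m >= 0 and n > 0.
    Postcondition: returns (q, r) with m == q * n + r and 0 <= r < n."""
    q, r = 0, m
    # Invariant on entry to each iteration: m == q * n + r and r >= 0.
    while r >= n:
        q, r = q + 1, r - n
    # On exit the loop condition fails, so r < n; together with the
    # invariant this is exactly the postcondition.
    return q, r

assert divide(17, 5) == (3, 2)    # 17 == 3 * 5 + 2
```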

Monday, October 20, 2008

Week 6

Main topic of week 6: complexity analysis, as introduced in week 4.

A function's complexity is the sum of the time complexity of each child call. Obviously, "child call" makes sense when the function calls itself somewhere, meaning the function is recursive. The function's time complexity may then be based on a quantity like the length of a list or string. If each child call receives a shorter list or string, the function makes progress towards the base case, and we can define a recursive time complexity T(n) in terms of the length. One thing worth mentioning here is that the complexity analysis is based on the worst-case scenario, which simplifies the calculation quite a bit.

Here is the idea for handling a "discontinuous" recurrence: the length of the list in a recursive function may not strictly decrease by one in each child call. In mergeSort(), the length of the list considered in each child call is half the length of the list in its parent call, so the time complexity function is naturally defined only on a series of discrete points n. To prove a bound, we can first prove it for lists of length n = 2^k (in the mergeSort() example), since k decreases by one in each subcall. Then we prove the function is monotonic, meaning either non-increasing or non-decreasing. Finally, we handle every other possible length by sandwiching it between 2^k and 2^(k+1).
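The mergeSort() recurrence at the powers of two can be checked directly. This sketch assumes the usual recurrence T(n) = 2 T(n/2) + n with T(1) = 1 (the exact constants in the course's version may differ):

```python
def T(n):
    """Worst-case step count for mergeSort; defined here only at powers of 2."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n      # two half-size calls plus a linear merge

# Unwinding gives T(2**k) = 2**k * (k + 1), i.e. T(n) = n log2(n) + n.
for k in range(10):
    n = 2 ** k
    assert T(n) == n * (k + 1)
```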

After finding the time complexity function, we can bound it by some other functions. Thus big-O and big-Omega are introduced here again.

Monday, October 13, 2008

Week 5

Main topic of week 5: closed forms of recursive functions and recursive definitions.

We worked through further examples of how to get the closed form of a function. Here is the idea:

1. Make a general conjecture about the format of the closed form.

2. Build linear equations based on the conjectured closed form.

3. Solve the equations using different values of n.
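The three steps above can be sketched on a small recurrence. The recurrence f(0) = 0, f(n) = 2 f(n-1) + 1 and the conjectured format f(n) = a * 2^n + b are my example, not one from the lecture:

```python
from fractions import Fraction

def f(n):
    return 0 if n == 0 else 2 * f(n - 1) + 1

# Step 1: conjecture f(n) = a * 2**n + b.
# Step 2: linear equations from two values of n:
#   a * 2**0 + b = f(0)
#   a * 2**1 + b = f(1)
# Step 3: solve the 2x2 system by elimination.
f0, f1 = Fraction(f(0)), Fraction(f(1))
a = (f1 - f0) / (2 - 1)
b = f0 - a

# The solution a = 1, b = -1 gives f(n) = 2**n - 1; check more values.
assert (a, b) == (1, -1)
assert all(f(n) == a * 2 ** n + b for n in range(10))
```

(Solving only pins down the candidate; the course's point is that the conjecture still has to be verified, e.g. by induction.)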

Then we talked about the definition of recursive sets; for example, a binary tree can be defined recursively. In this sense, the most powerful tool for proving a property of a recursively defined set is induction, because the two have very similar structures.

Sunday, October 5, 2008

Week 4

Main topic of week 4: recursive definition.

Recursion is a special naming process, and it is similar to induction: it gives you a general body and a base on which the body is built. It is easy to define a recursive function, simply by controlling its domain and range. But not every function is well expressed recursively; for example, f(n) = f(n-1) is surely not a well-defined recursive function, because there is no base case to tell where the function should end.

Now, given a recursive function, we have a way to guess what it will look like in closed form, by a process called unwinding. Because of the recursive definition, the function can be substituted by its child calls, which brings the function closer to its base. An interesting recursive function is the Fibonacci sequence, in which not just one base case but two of them need to be provided. We often need to calculate the sum of the first n terms of the function; for example, the total number of rabbits raised so far is the sum of the Fibonacci sequence. Similarly, if a time complexity function is provided, we can use the sum of the function to indicate how many steps we need to complete the function.
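The two base cases, and the sum of the first n terms, can be checked directly. The sum satisfies the standard identity F(1) + ... + F(n) = F(n+2) - 1 (the identity is a known fact, not something stated in the post):

```python
def fib(n):
    if n <= 2:              # two base cases: F(1) = F(2) = 1
        return 1
    return fib(n - 1) + fib(n - 2)

# Sum of the first n terms equals F(n+2) - 1.
for n in range(1, 12):
    assert sum(fib(i) for i in range(1, n + 1)) == fib(n + 2) - 1
```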

In summary, this week we learnt recursive definitions, a powerful tool called unwinding for getting the closed form of a recursive function, and how to find and evaluate the time complexity function.

Sunday, September 28, 2008

Week 3

A new proof structure, Well Ordering, is introduced this week. Among the three proof structures, I find well ordering the hardest to utilize. Proving something by well ordering feels like working backwards, because we use the "smallest element" and, most of the time, derive a contradiction to what we assumed. Take A1 Q3 (the golden ratio) for example: if phi = n1/n2, then phi = n2/(n1 - n2), so we get a set containing n1, n2, ..., nk, ..., where each n_{k+2} = n_k - n_{k+1}, and we find this set contains no smallest element; thus by well ordering phi is not rational.
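The argument can be written out in symbols (my reconstruction of the assignment question; the exact indices and notation may differ):

```latex
\varphi = \frac{n_1}{n_2} = \frac{n_2}{n_1 - n_2}
\quad\Longrightarrow\quad
n_{k+2} = n_k - n_{k+1},
\qquad 0 < n_{k+2} < n_{k+1}
\;\text{ since } 1 < \varphi < 2 .
```

So the set $\{n_1, n_2, n_3, \dots\}$ is a nonempty set of positive integers with no smallest element, contradicting well ordering; hence $\varphi$ is not rational.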

One thing worth reviewing is the relationship between the three proof structures.

Sunday, September 21, 2008

Week 2

We began to learn an additional flavour of induction this week: complete induction. The idea is introduced along with an example, proving that every natural number greater than 1 has a prime factorization. Simple induction cannot be used to solve this problem because it is hard to judge which natural number should be the base case, and it is impossible to list all of them. Complete induction, on the other hand, proves that the last "domino" will fall by assuming that all of its previous dominoes fell.
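A recursive factorization mirrors the complete-induction argument: if n > 1 is not prime it splits as d * (n // d) with both factors smaller than n, so "all the previous dominoes" cover them (the function is my sketch, not code from the course):

```python
def factorize(n):
    """Return a list of primes whose product is n, for n > 1."""
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:                    # n = d * (n // d), both factors < n
            return factorize(d) + factorize(n // d)
    return [n]                            # no divisor found: n itself is prime

assert factorize(60) == [2, 2, 3, 5]
assert factorize(13) == [13]
```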

I think these ideas are basically the same: simple induction proves the next statement using its immediate neighbour, while complete induction uses all the statements proved before it. In this sense, the concept of complete induction seems broader than simple induction, and I find simple induction easier to grasp if we learn complete induction first.

Sunday, September 14, 2008

Week 1

Most of week one's material overlaps with CSC165. Danny showed us some proof examples, and I'm very glad we are not going to use the same proving model anymore.