The Problem:
The list of positive whole numbers that add up to 1 is simply (1), and you might think of the product of this list as being 1.
There is more than one list of positive whole numbers that adds up to 2, and the lists (1,1) and (2) have different products when you multiply their elements: 1 and 2. The lists of positive whole numbers that add up to 3 yield the products 1, 2 and 3, corresponding to the lists (1,1,1), (1,2) and (3).
If n represents a positive whole number, what is the largest product that can be formed by multiplying the elements of a list of positive whole numbers that sum to n?
Devising a Plan:
The first thing to do is to create a table with three columns: the first column is the value of n, the second is the lists of positive whole numbers that sum to n, and the third is the list from the second column with the highest product. We should fill this table at least up to n = 7 so that there is enough solid data to work from; in a way, this serves as the base case of the proof. The conclusions drawn from this table can then be used to formulate the predicate that will be proved. A rough way to generate the table is sketched below.
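A minimal sketch of how that table could be generated (my own code, assuming the order of the numbers in a list does not matter, and using Python 3.8+ for math.prod):

    from math import prod

    def partitions(n, smallest=1):
        """Yield every list of whole numbers >= smallest that sums to n."""
        if n == 0:
            yield []
            return
        for first in range(smallest, n + 1):
            for rest in partitions(n - first, first):
                yield [first] + rest

    for n in range(1, 8):
        best = max(partitions(n), key=prod)
        print(n, best, prod(best))

Running this for n = 1 to 7 reproduces the third column of the table; for example, for n = 6 the best list is (3, 3) with product 9.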
Carrying out the Plan:
After creating the table for the first few values of n, we see the pattern that the list with the maximum product always consists of 2s, 3s, or both. From this observation we can form the predicate, assume P(1) /\ ... /\ P(n-1), and use the method of complete induction to prove that P(n) is also true. This works because any whole number larger than a certain whole number c can be written as a sum of smaller whole numbers, and the claim for those smaller numbers has already been assumed true by the induction hypothesis. A quick sanity check of this recurrence is sketched below.
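To convince myself that the induction idea holds before writing the proof, here is a rough check (my own helper names, not part of any course materials): the best product for n can be computed from the best products of smaller numbers, and it always matches the product you get by using only 2s and 3s.

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def max_product(n):
        """Largest product over all lists of positive whole numbers summing to n."""
        best = n                      # the one-element list (n)
        for k in range(1, n):
            # split n as k + (n - k); by the induction hypothesis we may take
            # max_product(n - k) as already known for the smaller remainder
            best = max(best, k * max_product(n - k))
        return best

    def product_of_2s_and_3s(n):
        """Product obtained by writing n (n >= 2) as a sum of 3s and at most two 2s."""
        if n % 3 == 0:
            return 3 ** (n // 3)
        if n % 3 == 1:
            return 3 ** (n // 3 - 1) * 4    # replace a 3 + 1 by 2 + 2
        return 3 ** (n // 3) * 2

    for n in range(2, 20):
        assert max_product(n) == product_of_2s_and_3s(n)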
Conclusion:
The above summarizes the steps that need to be carried out in order to end up with a complete induction proof for the problem. The table can be used in place of a base case and as a clue to the formation of the predicate. From that point the proof becomes very straightforward and intuitive.
WEEK #12
This might be the second-to-last post on this blog since I only have the problem solving episode left, and I am not sure about a posting for the thirteenth week of lectures. Assignment #3, the final assignment, was due on the Monday of this week and I must say that it was comparatively easier than Assignment #2. Most of the questions were straightforward, with only the last question being a little vaguely described. The lectures of this week covered the topic of non-regular languages and the pumping lemma. I did not understand the pumping lemma as well as I would have liked and will probably have to take another look at it, so hopefully it won't be included in Term Test #3. Regarding the problem solving episode, I am not sure what problem to undertake yet; I'll probably end up picking one from CSC165's bunch. I am also not sure about the exact format in which I should lay out my findings and solution.
WEEK #11
I think this is the second week in a row that I am attending Thursday's evening lecture, and having attended the morning lectures all the preceding weeks, I must say that in my opinion the evening one is much better. This is because the topics that are going to be covered within that week are presented in one sitting, and therefore you get a better understanding of them. I must admit that, in general, 3-hour lectures are not too great, but CSC236's is actually very good (probably due to the professor).
On another (related) note, the lecture for this week started with a method of proving that a DFSA works or, in more technical terms, ends up in an accepting state when it is supposed to. From there the topic shifted slightly from DFSAs to NFSAs, which seem to be a condensed version of DFSAs where multiple states are combined into a single one. Then there was another slight turn towards cartesian product machines, where one machine is built as a combination of at least two other machines. The lecture concluded by linking NFSAs and cartesian product machines.
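To make the cartesian product idea concrete for myself, here is a small toy sketch (my own example, not from the lecture): two little DFSAs, one accepting strings over {a, b} with an even number of a's and one accepting strings that end in b, combined into a product machine whose states are pairs.

    # Toy DFSAs given as (start state, accepting states, transition dict).
    M1 = ('even', {'even'},
          {('even', 'a'): 'odd', ('odd', 'a'): 'even',
           ('even', 'b'): 'even', ('odd', 'b'): 'odd'})
    M2 = ('no_b', {'b'},
          {('no_b', 'a'): 'no_b', ('b', 'a'): 'no_b',
           ('no_b', 'b'): 'b', ('b', 'b'): 'b'})

    def product_accepts(m1, m2, s):
        """Simulate the cartesian product machine on s; accept when both machines accept."""
        (q1, acc1, d1), (q2, acc2, d2) = m1, m2
        for ch in s:
            q1, q2 = d1[(q1, ch)], d2[(q2, ch)]   # the product state is the pair (q1, q2)
        return q1 in acc1 and q2 in acc2

    # e.g. product_accepts(M1, M2, "aab") is True (two a's and it ends in b)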
WEEK #10
Week 10's lectures began with a build-up on regular expressions (regexps), which were introduced last week. There was more discussion on the equivalence of regexps and the kinds of algebra that can be performed on them. Then there was a smooth transition into Deterministic Finite State Automata (DFSAs), which are machines that can be used to verify the correctness of inputs. These machines have a starting state, normal states, and accepting states, along with transitions that move the machine from one state to another depending on the next character of the input. The first formal example shown in lecture was a DFSA that accepted binary strings representing multiples of 3. I really liked regular expressions but am becoming very interested in DFSAs as well, and the good thing is that the two topics intertwine, so you get the best of both worlds.
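For my own reference, a minimal sketch of that example (the code and state names are my own, assuming the usual remainder-tracking construction): the machine keeps the remainder modulo 3 of the number read so far, and accepts when the remainder is 0.

    def accepts_multiple_of_3(bits):
        """DFSA with states {0, 1, 2}: the remainder mod 3 of the binary number read so far.
        The start state 0 is also the only accepting state."""
        state = 0
        for b in bits:
            # reading a bit shifts the number left and adds the bit: n -> 2n + b
            state = (2 * state + int(b)) % 3
        return state == 0

    # e.g. accepts_multiple_of_3("110") is True (six), accepts_multiple_of_3("111") is False (seven)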
WEEK #9
This week we started on a relatively different topic, Formal Languages. Professor Heap covered the basic yet fundamental topics of strings, languages (which are sets of strings), string operations, language operations (operations performed on sets of strings), and last but not least an introduction to regular expressions. The funny thing was that CSC207 and CSC236 seemed to start on the topic of regular expressions at approximately the same time, yet the CSC236 version seems much more theoretical compared to the more application-oriented approach of CSC207. In either case, regular expressions seem to be a very powerful and useful tool.
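As a tiny illustration of the application-oriented side (my own example, not taken from either course): the language described by the theoretical regular expression (ab)* can be matched in code with Python's re module.

    import re

    # strings consisting of zero or more repetitions of "ab"
    pattern = re.compile(r"^(ab)*$")

    print(bool(pattern.match("ababab")))   # True
    print(bool(pattern.match("abba")))     # False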
WEEK #8
Assignment #2 was due the Monday of this week and all of the questions were challenging, some more than others. Yet the first question seemed much harder than the rest, which was surprising since it had been postponed from the previous assignment to this one. The topics in this week's lectures are a build-up on last week's introduction to program correctness. Loop invariants were explained in more depth; they are key for proving what a loop actually achieves, and together with a quantity that decreases each iteration they show that the loop terminates, in other words that it gets closer to its goal after each pass. The problems became more challenging with trickier loop conditions and double/nested loops. There is no problem set for this Friday and no assignment for the coming Monday, so there is some time to sit down and review for next Friday's term test. A small example of what these annotations look like is sketched below.
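As a small reminder to myself of what those annotations look like in practice (my own toy example, in the style of the course's Python examples but not taken from them):

    def sum_up_to(n):
        """Return 0 + 1 + ... + n for a natural number n."""
        total, i = 0, 0
        # loop invariant: total == 0 + 1 + ... + i  and  0 <= i <= n
        # decreasing quantity (termination): n - i goes down by 1 each iteration
        while i < n:
            i += 1
            total += i
        # at exit i == n, so the invariant gives total == 0 + 1 + ... + n
        return total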
WEEK #7
The week started with a different professor than usual, who began the topic of program correctness; this was reviewed and continued in the following lectures by Professor Danny Heap. The procedure seems very obvious at the moment because it is being performed on functions that we already know will terminate and return the expected result; however, it is probably one of the most important aspects of software/algorithm testing. One of the main problems with functions and programs is that they can go into an infinite loop, and this is exactly the kind of thing that program-correctness methods can catch. Problem Set #4 was also postponed to the following Monday, which is a relief from the heavy workload of the past couple of weeks.
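A tiny sketch of the kind of statement this proving method works against (my own wording, assuming the usual precondition/postcondition style from lecture):

    def max_of(a, b):
        """Precondition: a and b are integers.
        Postcondition: returns m such that m >= a, m >= b, and m is one of a or b.
        Termination: there are no loops or recursive calls, so the function always terminates."""
        if a >= b:
            return a
        return b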