Friday, April 4, 2014

Week 12: Finale

Looking back on the semester, I regret slacking off most of the time. I didn't practice coding as much as I would have liked, and the recursion stuff still scares me.

I did terribly on test 2, so the only solution is to go through all the past exams, labs, and lectures and hopefully become comfortable enough for the exam. Thankfully, there are solutions to the test, so I can learn from my mistakes.

I didn't particularly appreciate the idea of the SLOG. It was not as enjoyable as writing code. I'd rather have more quizzes or exercises, anything directly related to coding.

Last but not least, I'm well aware that it's my own responsibility to keep practicing. When you do badly at something and feel frustrated about it, it's harder to persevere and keep doing it. This coming study break, I'll face my fears and fix all the problems I've created. I'm sure that if I try harder, I can still get 80s.

Good luck everyone!

Saturday, March 29, 2014

Week 11: Sorting

I fell ill this week, so unfortunately I missed the Wednesday and Friday lectures. But the good news is, this week's suggested reading is incredible. (Thanks, Dan! Good choice!)

There are detailed diagrams with well-elaborated explanations, code right under each diagram that you can run on the webpage itself, and multiple-choice questions to make sure you understand the concepts. What more could you ask for?

As a minor detail, I noticed that the selection sort in our course finds the minimum and inserts it on the left, instead of finding the largest and inserting it on the right.
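
In code, I imagine that variant looks roughly like this (a sketch of my own, not the course's actual implementation):

def selection_sort(lst):
    # repeatedly find the minimum of the unsorted part lst[i:]
    # and swap it into position i, growing a sorted prefix on the left
    for i in range(len(lst)):
        smallest = i
        for j in range(i + 1, len(lst)):
            if lst[j] < lst[smallest]:
                smallest = j
        lst[i], lst[smallest] = lst[smallest], lst[i]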

When I saw O(n lg n) in the slides, I was confused about where the lg n came from, so I started with the recommended reading. For merge sort, I found out that log n refers to the number of times you divide the list in half, where n is the length of the list. As an example, I found this on Stack Overflow:
"For example, starting from, say, 128 items, you divide into groups of 64, 32, 16, 8, 4, 2, 1 items -- 7 iterations (and log2(128) = 7)." (Source: http://stackoverflow.com/questions/10425506/intuitive-explanation-for-why-quicksort-is-n-log-n)
Also, n comes from the merge operation: you compare leftlist[i] with rightlist[j] one pair at a time to get the smaller one, and the actual work is done when you append the smaller one to the merged list.
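
To tie the two parts together, here's roughly what merge sort looks like; a minimal sketch of my own, not the reading's exact code:

def merge_sort(lst):
    # each level of recursion halves the list: that's the lg n part
    if len(lst) <= 1:
        return lst
    mid = len(lst) // 2
    return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))

def merge(left, right):
    # merging walks both halves once, comparing left[i] with right[j]
    # and appending the smaller: that's the n part at each level
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    return result + left[i:] + right[j:]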

The mechanism of quicksort was harder for me to grasp than merge sort, so I'll let my mind process it a bit more before I read about it again tomorrow.

Sunday, March 23, 2014

Week 10: A2's Wrath

A loving father would let his child take a glimpse of true hardship, not to let the child fear for eternity, but to arm him with tools to fight for himself.

Most UofT courses tend to serve us a tremendous number of deadlines near the end of the term. In these torturous weeks, I had to face assignment 2 while fatigued. Had I known how hellish this week would be, I certainly would have pushed myself to start the assignment on day one. The logic involved in this assignment is quite difficult to grasp. Once you understand the general idea of the regexes, as I did after reading the handout plenty of times, your goal becomes clear. The problem is, understanding your task is only step one of the whole assignment. Anyhow, once I understood what I was being asked to do and was happily coding away, the brackets of regex dot and regex bar started haunting me. As usual, I tried to look for hints on the discussion board and through Google. This time, Google had nothing for me, because Python has its own regular expression module with methods tailored to it. So my only guides were myself and the discussion board. I wish I'd never learned that people used helper functions for is_regex.

I had been thinking along the right track the entire time, even if my solution was nowhere close to the model answer. But I was too hung up on the idea that my approach would take forever and that the easier way out was most likely a helper function. Countless hours were wasted with me staring at the code, desperately trying to figure out what I would even want a helper function for. Eventually, after 12 hours, I finally decided to run with my own idea and got it finished half-decently. My only regret is that I never went back and added the one line of code that would have addressed all the remaining cases, which is a simple
else:
    return False
inside my brackets-addressing case.
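
For the record, here's the rough shape of what I ended up with, reconstructed from memory as a sketch; it assumes the grammar as I remember it ('0', '1', '2', 'e' as leaves, r* for star, and fully bracketed '(r1.r2)' and '(r1|r2)'), and it is definitely not the model answer:

def is_regex(s: str) -> bool:
    # base case: the single-character regexes
    if s in ('0', '1', '2', 'e'):
        return True
    # star case: r* is a regex whenever r is
    if s.endswith('*'):
        return is_regex(s[:-1])
    # bracket case: find the top-level '.' or '|' by tracking depth
    if s.startswith('(') and s.endswith(')'):
        depth = 0
        for i in range(1, len(s) - 1):
            if s[i] == '(':
                depth += 1
            elif s[i] == ')':
                depth -= 1
            elif s[i] in '.|' and depth == 0:
                return is_regex(s[1:i]) and is_regex(s[i + 1:len(s) - 1])
    # the line I never added: anything else is simply not a regex
    return False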

I really wish I knew how outstanding minds work. Recursion now strikes me as intricate ideas presented in a minimalist manner, just like how the most amazing performers work for years on one performance, and the outcome is a few hours on stage that some may overlook and never understand the value of. But I digress... I want to know how brilliant minds filter ideas after sufficient brainstorming, because when I got stuck, I had no will to carry on, not even for some simple trial and error. There was barely any idea in my head except one; I simply didn't understand how else I could have approached the problem. This is the first time I've felt so powerless in this course, but at least I learnt to start early on all assignments. By starting early, my mind can process the problem subconsciously and come up with ideas over time.

This assignment was another wake-up call about my habits. Hopefully I can make up for my inability to rely on my immediate wits with patience and determination... I mean, by not procrastinating, and strictly enforcing that from now on.

Sunday, March 16, 2014

Week 9: Complexity

One thing I love about CS is the logic involved. The signature approach to the permutation problem was really neat. It struck me that seeing and solving a problem from another angle can ultimately be the solution you're looking for. Except, how does one come up with a different approach? Unfortunately, I do not have an exact answer for that one. Perhaps it eventually comes if you expose yourself to problems regularly.

In this week's lab, we were confused about how you can call a method on a Node and expect Python to find the appropriate class's version of it. Our TA explained that this is Dynamic Binding: which method gets called depends on the class of the object it's called on.

For example,

class BST(object):
...
    def count_less(self, item):
        """(BST, object) -> int
        Return the number of items in this BST that are strictly less than
        item.
        """
        return self.root.count_less(item)
        # The line above calls _BSTNode.count_less(item) because self.root
        # is a _BSTNode
...

class _BSTNode(object):
...
    def count_less(self: '_BSTNode', item: object) -> int:
        """
        Return the number of items in the BST rooted at this node that are
        strictly less than item.
        """
        if self.item < item:
            # this item counts, and smaller items may sit in either subtree
            return self.left.count_less(item) + 1 + self.right.count_less(item)
        else:
            # this item and everything in its right subtree are >= item
            return self.left.count_less(item)

# Strangely enough, it calls the class addressing None nodes at the right
# time as well:

class _BSTNone(_BSTNode):
...
    def count_less(self, item):
        """(_BSTNone, object) -> int
        Return the number of items in the BST rooted at this node that are
        strictly less than item.
        """
        return 0

I'm still not sure how it works, so I Googled a bit. It's also called late binding, which refers to runtime binding, as contrasted with early binding, which refers to compile-time binding. (Source: http://stackoverflow.com/questions/10580/what-is-early-and-late-binding) That made absolutely no sense to me, just like the Wikipedia page. So I found this website: http://www.i-programmer.info/programming/theory/1477-late-binding-myths-and-reality.html

There's a detailed explanation there with an example in C#. I decided that I should probably worry about it later and spend the time looking at this week's code and maybe practicing a bit instead.
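
Still, the core idea seems simple enough when I strip it down to a toy Python example of my own (these class names are just made up for illustration):

class Animal:
    def speak(self):
        return '...'

class Dog(Animal):
    def speak(self):
        return 'woof'

def describe(animal):
    # which speak() runs is decided at runtime from the actual class of
    # the object passed in -- that's the "late" part of late binding
    return animal.speak()

print(describe(Animal()))  # prints ...
print(describe(Dog()))     # prints woof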

Friday, March 7, 2014

Week 8: Lab on LinkedLists

This week's lab is a follow-up to last week's task: we did the part called "implement list" (sort of). It uses a recursion similar to the LinkedList traversal I mentioned in last week's post. At some point after our lab on Tuesday, Dan added the extra section to the lab.
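
Since the snippets below rely on prepend and on the empty, head, and rest attributes, here's a minimal sketch of the kind of LinkedList class the lab builds on. It's my own reconstruction, not the course's actual class, and this toy version can't store None as an item:

class LinkedList:
    """A recursive list: either empty, or a head item followed by the
    rest of the list, which is itself a LinkedList."""

    def __init__(self, head=None, rest=None):
        # LinkedList() is empty; LinkedList(h) holds one item; and
        # LinkedList(h, r) is h followed by the LinkedList r
        self.empty = head is None
        self.head = head
        if not self.empty:
            self.rest = rest if rest is not None else LinkedList()

    def prepend(self, item):
        # push the current contents down one node and put item in front
        if self.empty:
            self.head, self.rest, self.empty = item, LinkedList(), False
        else:
            self.head, self.rest = item, LinkedList(self.head, self.rest)

    def __repr__(self):
        if self.empty:
            return 'LinkedList()'
        return 'LinkedList({!r}, {!r})'.format(self.head, self.rest)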

I did copy_ll:

  def copy_ll(self: 'LinkedList') -> 'LinkedList':
    """Return a copy of this LinkedList.
    >>> lnk = LinkedList(5)
    >>> lnk.prepend(7)
    >>> lnk2 = lnk.copy_ll()
    >>> lnk is lnk2
    False
    >>> lnk == lnk2
    True
    >>> lnk.prepend(11)
    >>> lnk == lnk2
    False
    """
    if self.empty:
      return LinkedList()
    # copy this node's item, then recursively copy the rest of the list
    return LinkedList(self.head, self.rest.copy_ll())

I ran the docstring test, but it failed on the first lnk == lnk2, so I tried it manually and kept printing lnk and lnk2 to compare them:

>>> lnk = LinkedList(5)
>>> lnk.prepend(7)
>>> lnk2 = lnk.copy_ll()
>>> lnk
LinkedList(7, LinkedList(5, LinkedList()))
>>> lnk2
LinkedList(7, LinkedList(5, LinkedList()))

They looked the same, so I suspected the lack of an __eq__ method was to blame. I proceeded to implement __eq__:

  def __eq__(self, other) -> bool:
    # two empty lists are equal
    if self.empty and other.empty:
      return True
    # empty vs. non-empty can never be equal
    elif self.empty or other.empty:
      return False
    # compare the heads, then recursively compare the rests
    return self.head == other.head and self.rest == other.rest

Then it worked. That was too much effort: I kept "fixing" copy_ll to pass the docstring test, and realized far too late that the missing __eq__ was the culprit (without it, == falls back to comparing identity, which is why two identical-looking lists compared unequal).

Tired now. See you next week :P

Friday, February 28, 2014

Week 7: Linked Lists

When first introduced, linked lists appear abstract and hard to grasp. The idea of "next" or "rest" didn't strike me as a straightforward concept until our TA, Bryan, explained it. This is what I've come to terms with by the end of the week:

Node 0: empty; it's the variable bound to the class instance. In other words, after
  >>> a = LinkedList()
node 0 is basically a, used to reach the first node (node 1) of the newly created instance of the class LinkedList. So, node 0 points to node 1.

Node 1: the first element of the linked list, similar to the first element of a traditional Python list. It points to node 2, i.e. the "rest" or "next" of node 1 is node 2.

Node 2: the second element of the linked list, similar to the second element of a traditional Python list. It points to node 3, i.e. the "rest" or "next" of node 2 is node 3.

Consider node 1 the head; the rest would be node 2, node 3, node 4, and so on until the end, because all the nodes are linked and each points to its own next.

Section 24.2 of the reading explains this similarly, but with graphical aids and shell code to illustrate it.

The methods that implement LinkedList are generally recursive. As the lab activity suggests, once you understand one of them, the rest tend to be similar.

The key is to start with the base case that handles None, then save rest in a temporary variable to preserve it while you change the head. To traverse the list, take an index and recursively call with index - 1 on the next node until the index tells you you've arrived where you want to be.
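
As a sketch of that traversal pattern, with a bare-bones Node class of my own rather than the lab's code:

class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def get_item(node, index):
    # base case: ran off the end of the list
    if node is None:
        raise IndexError('index out of range')
    # index 0 means this node is the one we want
    if index == 0:
        return node.value
    # otherwise move to the next node, asking for index - 1
    return get_item(node.next, index - 1)

lnk = Node(7, Node(5, Node(3)))
print(get_item(lnk, 2))  # prints 3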

That's it for the week! I hope we all lived up to our own standards on the midterm!

Friday, February 21, 2014

Week 6: Trees

I've never been a morning person. Especially not on Mondays. But ever since I switched to Comp. Sci., returning to the lecture hall on Monday mornings hasn't felt the same.

But I digress...

Back on Monday, Dan covered tree terminology and tree traversal, then followed up with code implementations for the rest of the week. The terminology seems simple enough; "parent" and "sibling" make sense to just about anyone. But tree traversals aren't as simple as they look.

On page 11 of week 6's slides, it says:

"Preorder: Visit the root node, do a preorder traversal of the
 left subtree, and do a preorder traversal of the right subtree

Inorder: Do an inorder traversal of the left subtree, visit the
root node, and then do an inorder traversal of the right
subtree

Postorder: do a postorder traversal of the left subtree, do a
postorder traversal of the right subtree, and visit the root node"

I thought I understood tree traversal after reading these three definitions, but once we got to the actual example, the inorder traversal was far from what I expected.

After a simple Google search, I found a website with a more detailed example and explanation for a CS beginner like me.

Here's the link: http://datastructuresnotes.blogspot.ca/2009/02/binary-tree-traversal-preorder-inorder.html


(This is a picture from the same website.)

The Inorder traversal is done recursively. Here's how the author of that blog explained it:

"(i) Traverse the left most subtree starting at the left external node, (ii) Visit the root, and (iii) Traverse the right subtree starting at the left external node.
Therefore, the Inorder traversal of the above tree will outputs:
0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10"
It cleared up the concept for me quite a lot. I originally thought you start with the left subtree and just enumerate each level in ascending order, then visit the root, and eventually visit the right subtree. That is wrong. To illustrate my misconception, here's what my method would produce for the figure above: 1, 0, 3, 2, 5, 4, 6, 7, 9, 8, 10.

Of course, don't do that!
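
For my own reference, the correct inorder recursion turns out to be short in code; a minimal sketch with a made-up Node class, not the course's implementation:

class Node:
    def __init__(self, item, left=None, right=None):
        self.item, self.left, self.right = item, left, right

def inorder(node):
    # left subtree first, then the root, then the right subtree
    if node is None:
        return []
    return inorder(node.left) + [node.item] + inorder(node.right)

# For a binary search tree, inorder gives the items in sorted order:
#         2
#        / \
#       1   4
#          / \
#         3   5
tree = Node(2, Node(1), Node(4, Node(3), Node(5)))
print(inorder(tree))  # [1, 2, 3, 4, 5]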

That's all for the week. Thanks!