Sunrise

I got to see Hiatus Kaiyote at Outside Lands 2016 last weekend. They were one of the best bands I’ve seen live so far. Seeing them live gave me a newfound love and appreciation for their music, and I’ve been listening to their songs all week. Not only is their music beautiful, but the pictures they paint with their lyrics are gorgeous as well, and some of them speak to my current state of mind.


Quotes: July Edition

I wrote a blog post talking about how I write down lines from books I’m reading. Here are some lines that I loved from books that I read in July:

We are what we pretend to be, so we must be careful about what we pretend to be. — Mother Night

No one has mastered the art of life. Everyone is just stumbling in the dark. — Here

It pays to be obvious, especially if you have a reputation for subtlety. — Foundation

To succeed, planning alone is insufficient. One must improvise as well. — Foundation

Memories sting when they come suddenly. — Foundation

Goal Tracking: July Edition

At the beginning of the year I published a post outlining some of my goals for the year. In the spirit of being transparent, here is the progress I made on them over the course of July:

  1. Was there for everyone who needed me for most of the month.
  2. Volunteered for 0 hours.
  3. Quite a bit of procrastination.
  4. Honest and open. As always.
  5. Made some good progress on learning Rust! I wrote about it here and here.
  6. I read four books over the course of July: Mother Night (another amazing book by Kurt Vonnegut), Here (I LOVED the artwork in this graphic novel. I wish it had a better story though), DC/Dark Horse: Aliens (an action-packed, fun read), and Foundation (This is the first Asimov book I’ve read and I loved it! It was quite unlike any other sci-fi book I’ve ever read, and I can’t wait to read the rest of the books in this classic series).
  7. I read 2 research papers: Disruptor and COST. I’ll be writing about them soon.
  8. I wrote 8 posts.
  9. I learned to play at least one song on the guitar. Minus the solo.
  10. (a) (goal achieved)
    (b) (goal achieved)
    (c) (goal achieved)
    (d) I ran the first half of the SF Marathon!
    (e) (goal achieved)

Electron

I’ve found myself listening to electronic music more often these days. It makes for quite good coding music, and writing music too.

Here’s a subset of what I’ve been listening to:

  • Jake Bowen is the guitarist for Periphery (one of my favorite bands; I have VIP tickets to see them in August). He also makes amazing electronic music.

  • Tycho. He’s from San Francisco!

  • It’s hard to classify the genre of music The Algorithm plays. All I know is that I really like it.

  • Com Truise made a single with Deftones (another favorite band of mine; I’m very excited to see them in August).

Oxide

I just finished sections 24 to 36 of chapter 4 of the Rust book. Here’s what I felt:

    • Associated types seem like an improvement over generics. They feel like an important concept for writing effective Rust code, and I wish this chapter had gone into more detail and had a larger example (I’ve sketched one of my own after this list).
    • Rust supports macros. As the chapter mentions, I probably wouldn’t write one unless I absolutely had to. If Rust supported a variable number of arguments to functions, one could probably implement vec! using that plus generics (the macro sketch below shows how vec!-style macros handle this today).
    • unsafe seems like a very powerful and tricky Rust feature. I wish the chapter had an actual example demonstrating how to use unsafe correctly, and also an example of when not to use it, for instance when you’re writing bad Rust code and using unsafe to mask a bad design (a small example of correct usage follows below).
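
To scratch that itch, here is a minimal sketch of my own (the Container trait, Stack struct, and function names are made up, not from the book): the implementor picks the Item type once, and callers refer to it as C::Item instead of threading an extra type parameter around.

    // A trait with an associated type: each implementor picks exactly one Item.
    trait Container {
        type Item;
        fn first(&self) -> Option<&Self::Item>;
    }

    struct Stack {
        values: Vec<i32>,
    }

    impl Container for Stack {
        type Item = i32;

        fn first(&self) -> Option<&i32> {
            self.values.first()
        }
    }

    // Callers name the element type as C::Item; no extra type parameter needed.
    fn describe<C: Container>(c: &C)
    where
        C::Item: std::fmt::Debug,
    {
        match c.first() {
            Some(item) => println!("first item: {:?}", item),
            None => println!("empty container"),
        }
    }

    fn main() {
        let s = Stack { values: vec![10, 20, 30] };
        describe(&s);
    }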
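
The macro sketch referenced above is a stripped-down vec!-style macro (my_vec! is a made-up name, and this is not how the standard library actually implements vec!); it shows how macro_rules! handles a variable number of arguments.

    // A simplified vec!-like macro: matches zero or more comma-separated
    // expressions and pushes each one into a freshly created Vec.
    macro_rules! my_vec {
        () => {
            Vec::new()
        };
        ( $( $x:expr ),* ) => {{
            let mut v = Vec::new();
            $( v.push($x); )*
            v
        }};
    }

    fn main() {
        let empty: Vec<i32> = my_vec![];
        let nums = my_vec![1, 2, 3];
        println!("{:?} {:?}", empty, nums);
    }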
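
And for unsafe, this is roughly the example I was hoping for (again my own toy, not the book’s): a small unsafe block wrapped in a safe function, where a bounds check the compiler can’t see for itself is what justifies it. The anti-pattern would be the opposite: an unchecked unsafe block used to silence the compiler and paper over a bad design.

    // A safe function hiding a small, well-justified unsafe block.
    fn get_first(slice: &[i32]) -> Option<i32> {
        if slice.is_empty() {
            return None;
        }
        // SAFETY: we just checked that the slice is non-empty, so reading
        // the first element through its raw pointer is valid.
        let first = unsafe { *slice.as_ptr() };
        Some(first)
    }

    fn main() {
        println!("{:?}", get_first(&[7, 8, 9])); // Some(7)
        println!("{:?}", get_first(&[]));        // None
    }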

(You can find my thoughts on the previous chapters / sections here)

Change

We are halfway through 2016, and already this year has been full of big changes for me on several levels.


After nearly 3 years (including my summer internship) at LinkedIn, I left the company and joined Uber in May 2016.

“There are three things we cry for in life: things that are lost, things that are found, and things that are magnificent.” — Douglas Coupland

I cried on my last day at LinkedIn. And it was because of those three things that Douglas mentioned (and perhaps more): I had to say goodbye to some great coworkers, I found amazing friends while at LinkedIn, and the entire experience of working for LinkedIn was magnificent.

All that being said, I love my new job at Uber and am extremely happy at this company!


I’m more fit than I was at the beginning of 2016.

I started lifting more seriously around February 2016 and definitely feel stronger. In the past I focused on a few specific body parts while ignoring the others. I made a conscious effort to change this and can already see and feel the benefits. The blisters on my palms have only gotten worse though 😦.

I’ve started running more as well. I’ve gone from struggling to run 3 miles at the beginning of 2016 to being able to run 8 miles (my longest run was 10 miles on 07/06/2016) without any problems. This wouldn’t have been possible without the changes I made to my diet.


My love, appreciation, and respect for my family and friends have increased. Thank you for everything. You mean the world to me. I’m incredibly lucky to have people like you in my life.


In April 2016 I decided to become an atheist. Prior to that I was a Hindu. It took a lot of thinking and self-reflection for me to make this decision, and I don’t think I’m quite ready to talk about it publicly on this blog yet. If you’re interested in why I made this change feel free to message me or talk to me in person.


I’ve started using hair styling products. Because why not?🙂

Ferric

I spent the past few days working through the first 3 chapters and the first 23 sections of chapter 4 of the Rust book. Here are my initial thoughts on Rust:

    • Cargo is pretty sweet. It’s an easy-to-understand build and dependency management system. Even though I’m only using it for simple things so far, I’m really happy with it and have not run into any issues. I’ve also gotten very used to running cargo new <project_name> [--bin] to start new Rust projects.
    • Compared to Go, Rust is a much larger language with many more concepts to learn. Go is a simple language to pick up; I feel that Rust has a noticeably steeper learning curve.
    • Memory safety is one of Rust’s strongest selling points. It is also one of the trickier concepts to understand, and unlike anything I’ve experienced in C, C++, Java, Go, Python, etc.; the closest analogues I can think of are unique_ptr and shared_ptr in C++11. Consequently I spent the most time on the three sections dedicated to references and borrowing, lifetimes, and mutability, and most of the bugs I ran into while writing Rust code were related to these concepts (there’s a small sketch of a typical one after this list).
    • Rust has generics. This was something I missed in Go.
    • I haven’t gotten used to writing functions without an explicit return yet (see the second sketch after this list).
    • The Rust book is very well written, but there are a few areas for improvement. My major gripe is that it introduces new language constructs without explaining what they do. For instance, the chapter on trait objects introduces the format! macro for the first time without explaining what it does, and the chapter on closures uses a Box to return a closure from a function without going into what exactly a Box is (I’ve sketched my own explanation of both after this list).
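
To make the borrowing point above concrete, here is a sketch of the kind of bug I kept writing (the names are made up): taking a mutable borrow while an immutable one is still in use.

    fn main() {
        let mut names = vec![String::from("ada"), String::from("grace")];

        // Immutable borrow of `names`.
        let first = &names[0];

        // Uncommenting the next line is rejected with error E0502, because
        // `first` (an immutable borrow) is still used afterwards:
        // names.push(String::from("alan"));

        println!("first name: {}", first);

        // Once the immutable borrow is no longer used, mutation is fine.
        names.push(String::from("alan"));
        println!("{} names", names.len());
    }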
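
On the explicit-return point, this is the pattern I’m still getting used to: the final expression of a block, written without a trailing semicolon, is the function’s return value.

    // The last expression (no semicolon) is the return value.
    fn square(x: i32) -> i32 {
        x * x
    }

    // An early `return` is still available when it reads better.
    fn clamp_non_negative(x: i32) -> i32 {
        if x < 0 {
            return 0;
        }
        x
    }

    fn main() {
        println!("{} {}", square(4), clamp_non_negative(-3)); // 16 0
    }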
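
Finally, since the book didn’t explain format! or Box when it first used them, here is my own summary with a small sketch (not the book’s example): format! builds a String the same way println! builds output, and Box places a value on the heap, which is what lets a function return a closure whose concrete type and size aren’t known to the caller.

    // `format!` works like `println!`, but returns the formatted String
    // instead of printing it.
    fn greeting(name: &str) -> String {
        format!("Hello, {}!", name)
    }

    // Boxing the closure gives it a known size, so it can be returned
    // from the function as a trait object.
    fn make_adder(n: i32) -> Box<dyn Fn(i32) -> i32> {
        Box::new(move |x| x + n)
    }

    fn main() {
        println!("{}", greeting("world"));
        let add_five = make_adder(5);
        println!("{}", add_five(10)); // 15
    }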


Intelligence

Inspired by a tutorial on TensorFlow that was on HN recently, I decided to read the TensorFlow paper. The paper had been sitting in my “To Read” folder for quite some time, but for various reasons I never got around to it. This is also the first AI/ML paper I’ve read in 2016, so I was excited to dive right in.

At 19 pages, this is one of the longest papers I’ve read. But it is extremely well written, with lots of diagrams, charts, and code samples interspersed throughout the text that make it fun to read.

The basic idea of TensorFlow, to have one system that can work across heterogeneous computing platforms to solve AI/ML problems, is incredibly powerful. I fell in love with the directed-graph API TensorFlow uses to describe the computations that will run on it (this may or may not be related to the fact that I also love graph theory). The multi-device (and distributed) execution algorithm explained in the paper is quite intuitive and easy to understand. A major component of multi-device/distributed execution of the TensorFlow graph is deciding which device to place each node on. While the paper does explain the algorithm used in section 3.2.1, I wish the authors had gone into more detail and talked about which graph placement algorithms didn’t work, the specifics of the greedy heuristic used, etc.
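
To illustrate what I love about the describe-the-computation-as-a-graph idea, here is a toy sketch in Rust. It has nothing to do with TensorFlow’s actual API (the Op and Graph types below are entirely made up); it only shows the general pattern of building a DAG of operations first and executing it in a separate step.

    use std::collections::HashMap;

    // A toy dataflow graph: build the computation as a DAG of ops first,
    // then execute it afterwards.
    #[derive(Clone, Copy)]
    enum Op {
        Const(f64),
        Add(usize, usize), // ids of the two input nodes
        Mul(usize, usize),
    }

    struct Graph {
        nodes: Vec<Op>,
    }

    impl Graph {
        fn new() -> Self {
            Graph { nodes: Vec::new() }
        }

        // Adding a node returns its id; later nodes use ids as input edges.
        fn add(&mut self, op: Op) -> usize {
            self.nodes.push(op);
            self.nodes.len() - 1
        }

        // Evaluate nodes in insertion order, which is a valid topological
        // order here because a node can only reference earlier nodes.
        fn run(&self, target: usize) -> f64 {
            let mut values: HashMap<usize, f64> = HashMap::new();
            for (id, op) in self.nodes.iter().enumerate() {
                let v = match *op {
                    Op::Const(c) => c,
                    Op::Add(a, b) => values[&a] + values[&b],
                    Op::Mul(a, b) => values[&a] * values[&b],
                };
                values.insert(id, v);
            }
            values[&target]
        }
    }

    fn main() {
        let mut g = Graph::new();
        let x = g.add(Op::Const(2.0));
        let y = g.add(Op::Const(3.0));
        let sum = g.add(Op::Add(x, y));
        let out = g.add(Op::Mul(sum, sum));
        println!("result = {}", g.run(out)); // (2 + 3) * (2 + 3) = 25
    }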

Sections 5, 6, and 7 were my favorite portions of the paper. Section 5 dives into some of the performance optimizations used in TensorFlow. It would have been awesome if the authors had given more details about the scheduling algorithm used to minimize memory and network bandwidth consumption. I would have also liked to know what other scheduling optimizations were used in TensorFlow as I find scheduling algorithms very interesting.

Section 6 talks about the experience of porting the Inception model over to TensorFlow. While the strategies mentioned in this section are specific to machine learning systems, I feel that some of them can be tweaked a little to apply to software systems in general. For instance,

“Start small and scale up” (strategy #2)

is directly applicable to any software system. Similarly,

“Make a single machine implementation match before debugging a distributed implementation” (strategy #4)

can be rephrased as

“Make a single machine implementation work before debugging a distributed implementation”

and be generally applicable to building distributed systems.

Section 7 explains how TensorFlow can be used to speed up stochastic gradient descent (SGD). Again, while the idioms presented in this section are used to speed up SGD, I feel they are general-purpose enough to be applied to other algorithms and systems as well. The diagrams in this section are amazing and do a great job of illustrating the differences between the various parallelism and concurrency idioms.

EEG, the internal performance tool mentioned in the paper, sounds very interesting. While it is probably outside the scope of a paper focused on TensorFlow, I’d love to learn more about EEG. It seems like a very powerful tool and could probably be extended to work with other systems as well.

The paper ends with a survey of related systems. This section proved to be a valuable source for finding new AI/ML and systems papers to read.

I loved this paper.