Sunday, February 22, 2015

Creating a Git Repo from Xcode




While at Blueprint this weekend, my team and I spent a good few hours learning how to connect to GitHub from Xcode. To simplify your lives, I have explained how to do so below (and if you prefer the command line, there is an equivalent Terminal sketch after the steps):

  1. Create a new project in Xcode and make it a Single View Application. Make sure to check the box to create a local Git repository.

  2. Go to Preferences --> Accounts and add a repository.

  3. Go to GitHub and create a new repo, or use an existing one. Copy the HTTPS clone URL and paste it in as the repository address.

  4. Enter your GitHub account info.

  5. Go to Source Control and click Remotes.

  6. Add a remote, make sure it is called "origin", and paste in the address of your Git repo.

  7. Edit your project, go to Source Control and click Commit, then select your remote.

  8. You will see your new repo on GitHub!
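For reference, here is roughly what those Xcode steps translate to in the Terminal. This is a minimal sketch, assuming your repo's HTTPS clone URL is https://github.com/<you>/<repo>.git (a placeholder, substitute your own):

    # Run these inside your project folder. Xcode already ran `git init`
    # for you when you checked the local repository box in step 1.
    git remote add origin https://github.com/<you>/<repo>.git    # step 6: add the "origin" remote
    git add .
    git commit -m "Initial commit"                                # step 7: commit your files
    git push -u origin master                                     # step 8: your repo appears on GitHub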

Tuesday, February 17, 2015

HSHacks | First Hackathon Reflection

Hey all,

Last weekend I attended HSHacks, my first ever hackathon.

A hackathon is an interesting invention that is kind of hard to explain. It is basically an event where many "hackers" gather together and "hack" on some idea that they have. A "hack" can include anything from a hardware project to a mobile application. Some hackathons have themes, where competitors need to create a product that solves a specific problem or falls into a specific category. Hackathons usually take place over a weekend and have many fun workshops and events where you can learn new things and try new products. HSHacks was a 24-hour event and was truly the best first-time experience I could have asked for.

I really enjoyed HSHacks for a variety of reasons, but I truly loved the atmosphere. I loved how excited and lively everyone was. Though I usually program in quiet spaces, I found coding in such a noisy and crowded place quite therapeutic. Unfortunately, after 2 am a dance party ensued in my work area and I had to uproot my setup in search of quieter terrain. There were a few individuals who were not very supportive of the other hackers and were a bit aggressive for my taste, but at the end of the day everyone's hacks were great and I learned so much. I got to test out an Oculus and meet many people who, like me, have started businesses.

Taking selfies with the crowd and t-shirt cannon antics during the opening ceremony. 

I programmed an iOS app, called WingIt, that analyzes the strokes you draw on the screen, recognizes the curves and lines, and provides feedback as you draw. The link to my ChallengePost submission is here. I wrote the entire app in Swift and ran into many bugs, but as always I worked through them. For me, if a bug gives me a lot of grief I just get more and more determined to fix it. So when it finally works I am ecstatic.

Celebrating the little victories that made up the weekend.

A few of my friends from school were there, and we were all working on our own projects and helping each other. The energy and passion around me was infectious, and when I returned home I was fired up to keep going.

My friends Nikhil, Zabin, Sriram and myself running on a whopping two hours of sleep.

The challenge for me during hackathons is mental endurance. Just like running a marathon, hacking for such extended stretches on so little sleep takes diligent training, especially at the collegiate level, where hackathons can extend to 36 hours. By the time I reach the collegiate level I want to be able to create involved hacks without getting too tired or mentally exhausted.

I can't wait to go to more of these events and create more involved hacks. I am going to the Blueprint hackathon at MIT this weekend; I will keep you posted on my experience :)

Bye for now,

Ashwini 

Friday, February 6, 2015

Registering Locations | Swift


Here is a very helpful video I found about how to register a user's location using the Swift programming language in Xcode.
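In the same spirit, here is a minimal sketch of the idea using Apple's CoreLocation framework (written in current Swift syntax; the class name LocationRegistrar is my own invention, and it assumes you have added NSLocationWhenInUseUsageDescription to your Info.plist):

    import CoreLocation

    // A tiny wrapper that asks for permission and then receives location updates.
    class LocationRegistrar: NSObject, CLLocationManagerDelegate {
        let manager = CLLocationManager()

        override init() {
            super.init()
            manager.delegate = self
            manager.desiredAccuracy = kCLLocationAccuracyBest
            manager.requestWhenInUseAuthorization()  // prompts the user for permission
            manager.startUpdatingLocation()
        }

        // Called whenever CoreLocation has a new location fix for us
        func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
            if let location = locations.last {
                print("Latitude: \(location.coordinate.latitude), longitude: \(location.coordinate.longitude)")
            }
        }

        func locationManager(_ manager: CLLocationManager, didFailWithError error: Error) {
            print("Location update failed: \(error.localizedDescription)")
        }
    }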



Thursday, February 5, 2015

Setting up a new project | Xcode

I just started programming my own iOS apps this year! Here is how you set up your first project in Xcode:


1. Open Xcode :) If you do not already have it downloaded, you should get it from the Mac App Store.

2. Create a new project.

3. Select a "Single View Application".

4. Title your app and select Swift as the programming language.

5. Disable size classes and select yes when it asks for confirmation.

6. You have now set up your project. Code away :) If you run it, you will see a simple blank app in the simulator! (A tiny sanity-check snippet follows below.)
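For a quick sanity check, here is a hypothetical tweak to the ViewController.swift that the template generates (current Swift syntax; the label text is my own). Run it and the blank screen turns orange:

    import UIKit

    class ViewController: UIViewController {
        override func viewDidLoad() {
            super.viewDidLoad()
            // Tint the blank view and drop in a label so you can
            // see your own code running in the simulator.
            view.backgroundColor = .orange
            let label = UILabel(frame: CGRect(x: 20, y: 80, width: 280, height: 40))
            label.text = "Hello, Xcode!"
            view.addSubview(label)
        }
    }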



Wednesday, February 4, 2015

Rosenblatt's Multi-layer Perceptron


Inspired by biological neural networks, in particular the functionality of the brain, artificial Neural Networks are a branch of machine learning that uses statistical algorithms to predict the outcomes of functions with multiple known inputs. Neural Networks are usually represented as a system of interconnected “neurons” which are capable of pattern recognition and computing values. The formative years of Artificial Neural Networks were from 1943 to 1958, during which many large and influential discoveries were made. In 1958, a famous researcher named Frank Rosenblatt proposed that a “perceptron” could be used as the first model for supervised machine learning.

A perceptron is the simplest form of a neural network and is used to classify patterns that lie on opposite sides of a hyperplane, also known as linearly separable patterns. A perceptron is made of a single neuron with adjustable synaptic weights. With his perceptron, Rosenblatt proved the perceptron convergence theorem, which states that if the vectors used to train the perceptron are taken from two linearly separable classes, then the algorithm converges and positions the decision surface in the form of a hyperplane between the two classes.

Rosenblatt made many public statements about the perceptron and its ability to learn, sending the Artificial Intelligence community into heated debate. It was later discovered that the single-layer perceptron could not be trained to recognize many classes of patterns. This stalled the field of neural network research considerably before people realized that a “feed-forward neural network”, also known as a “multi-layer perceptron”, with two or more layers had much more processing power than previous models. It was then shown that multi-layer perceptrons could solve even problems that are not linearly separable, like XOR. Single-layer perceptrons were only able to solve linearly separable problems like AND and OR, and in 1969 two researchers named Marvin Minsky and Seymour Papert wrote a book titled Perceptrons that proved that single-layer perceptrons would never be able to learn non-linearly separable patterns like XOR. (A small worked example is sketched below.)
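To see what “linearly separable” means in code, here is a minimal Swift sketch of a single step-function neuron implementing AND; the weights [1, 1] and bias -1.5 are my own illustrative choice, not from any textbook:

    import Foundation

    // A single neuron with a Heaviside step activation.
    func stepNeuron(_ inputs: [Double], weights: [Double], bias: Double) -> Int {
        let net = zip(inputs, weights).map { $0 * $1 }.reduce(0, +) + bias
        return net >= 0 ? 1 : 0   // fires only when the weighted sum crosses the threshold
    }

    for a in [0.0, 1.0] {
        for b in [0.0, 1.0] {
            print("AND(\(Int(a)), \(Int(b))) = \(stepNeuron([a, b], weights: [1, 1], bias: -1.5))")
        }
    }

No choice of weights and bias can make this single neuron compute XOR: there is no single line separating {(0,1), (1,0)} from {(0,0), (1,1)}, which is exactly the limitation Minsky and Papert pointed out.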

The multi-layer perceptron computes a single output from multiple inputs by forming a linear combination (the propagation rule) according to its input weights and then passing the result through an activation function. This can be mathematically represented by:

y = λ( Σᵢ wᵢ xᵢ + β )

Where “w” represents the values of the weights, “x” the values of the inputs, β the bias, and λ the activation function. Although in Rosenblatt’s original multi-layer perceptron a Heaviside step function was used as the activation function, these days a logistic sigmoid function is typically used, as it is mathematically convenient: it is smooth near the origin and changes quickly when leaving it.
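To make the formula concrete, here is a minimal Swift sketch of a single neuron’s output (the function names are my own, not from any library):

    import Foundation

    // y = λ( Σᵢ wᵢ xᵢ + β ) with a logistic sigmoid as the activation λ
    func sigmoid(_ z: Double) -> Double {
        return 1.0 / (1.0 + exp(-z))
    }

    func neuronOutput(weights: [Double], inputs: [Double], bias: Double) -> Double {
        // Propagation rule: linear combination of weights and inputs, plus the bias
        let net = zip(weights, inputs).map { $0 * $1 }.reduce(0, +) + bias
        // Activation function applied to the net input
        return sigmoid(net)
    }

    let y = neuronOutput(weights: [0.4, -0.6], inputs: [1.0, 0.5], bias: 0.1)
    print(y)  // always between 0 and 1 for the sigmoid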

A single-layer perceptron is not very useful, not only for the reasons stated previously but also because of its limited mapping ability. No matter what activation function is used, the single-layer perceptron will always represent a ridge-like function. Below is a graphic representation of a single-layer perceptron, and below it a multi-layer perceptron:


[Diagrams: a single-layer perceptron (top) and a fully connected multi-layer perceptron (bottom)]

Multi-layer perceptrons require eight basic elements: a set of processing units, represented by the circles in the diagram above; a state of activation (is the unit initially on or off?); an output function for each unit, which is the state of activation after the activation rule has been applied; a pattern of connectivity, as represented by the fully connected model above; a propagation rule, which determines how incoming information is combined to set a unit's activation; an activation rule, which transforms the output of the propagation rule; a learning rule, which is how we will train the network; and finally an environment to run it on.

Multi-layer perceptrons are usually used in “supervised learning” environments, where there is a training set of input-output pairs that the network must learn to model. In the perceptron above, training means adapting the weights and biases to achieve the desired outcome. To train a multi-layer perceptron in this setting, a “backpropagation algorithm” can be used. A backpropagation algorithm has two parts: the forward pass and the backward pass. In the forward pass, the inputs are propagated through the network (through equations like the activation function) to produce predicted outputs. Then, in the backward pass, partial derivatives of the error with respect to the various parameters are sent back through the network and used to adjust the weights. This process is iterated until the weights converge and the error is minimized.
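As a rough illustration of one forward/backward iteration, here is a hedged Swift sketch of a single gradient-descent step for the lone sigmoid neuron from the earlier snippet (squared-error loss; the names and learning rate are my own choices, not a standard API):

    // One training step for a single sigmoid neuron, reusing sigmoid(_:)
    // and neuronOutput(weights:inputs:bias:) from the sketch above.
    func trainStep(weights: inout [Double], bias: inout Double,
                   inputs: [Double], target: Double, learningRate: Double) {
        // Forward pass: compute the predicted output for these inputs
        let y = neuronOutput(weights: weights, inputs: inputs, bias: bias)
        // Backward pass: the derivative of the squared error (y - target)²/2
        // through the sigmoid gives delta = (y - target) · y · (1 - y)
        let delta = (y - target) * y * (1 - y)
        // Nudge each weight and the bias in the direction that reduces the error
        for i in weights.indices {
            weights[i] -= learningRate * delta * inputs[i]
        }
        bias -= learningRate * delta
    }

Looping trainStep over all of the input-target pairs until the error stops shrinking is exactly the iteration described above; a real multi-layer network does the same thing, but chains the partial derivatives backward through every layer.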

The multi-layer perceptron network can also be used for unsupervised learning, via an “auto-associative structure”: setting the same values for both the inputs and the outputs of the network. The resulting sources then come from the values of the hidden layer’s units. The network must have at minimum three hidden layers for any conclusion to be drawn, and it is an incredibly computationally intensive process.


Rosenblatt’s discovery of the perceptron revolutionized the world of Artificial Neural Networks, paved the way for many discoveries in the field, and created countless jobs. He laid the foundation for much of the artificial intelligence and pattern recognition software actively used today by large companies like Google and Facebook.


Tuesday, February 3, 2015

Artificial Neural Networks | Introduction

Hello,

My Neural Networks class has been pretty exciting! It marks the last computer science class I will take as a high schooler and it is taught by the same teacher who taught me freshman physics :) What is cool about this class is that we will be focusing on one kind of network (the multi-layer perceptron) and will be learning how to do cool things with it. I like that we will be able to actually create programs and will be applying the theory we learn in class. I will keep you posted!

Ashwini 

Monday, February 2, 2015

I'm Back! | Computer Architecture Reflection

Hi Friends,

I apologize for my absence these past few months. 

I was in the heat of college applications and as some of you may know, they can kind of consume you. 

I learned quite a bit in my Computer Architecture class, but I did not get a chance to document it all for you. The class was a semester course and we made a Hz processor with wires and chips. 

Apparently, my class was particularly unique. A few of my friends who are in college have told me that many college level courses are more theory based and do not emphasize hands-on learning. 

Our class had a few lectures on the theory behind what we were building, but I found that I learned quite a bit of it on my own through experimentation. I learned many valuable skills, including proper form for documentation, how to draw schematics, and shortcuts in the seemingly endless process of debugging. My professor required us to wire up everything we made - logic without execution was not accepted.

Initially I found it tedious and frustrating when my breadboard would begin smoking or I had a short, but towards the end of the semester I learned how valuable knowing how to create what I designed truly was. 

I never want to go out into the world or work in a team without knowing how to execute my vision. That would be catastrophic.

My professor and my computer architecture class taught me to value every step of the learning process. I found myself enjoying every day, including test days. It was just very rewarding and was a truly healthy environment. 

In addition, there has been some discussion amongst my peers at school over the disproportionate number of females to males in advanced computer science courses. In fact there was an entire article about it in my school's publication, Wingspan. Many people have warned me that being a woman in a male dominated field is going to be hard and that I will be discouraged a majority of the time, but I am not someone who walks away from a challenge. I love computer science and I will do what I am passionate about. 

As a young woman, I am incredibly supportive of women entering engineering fields. But I must say that my Computer Architecture class had an equal number of girls and boys; we were equally represented. Maybe this was an anomaly, but in a class of 10 students, exactly 5 of us were girls. Several people dropped the class early in the term, but none of them were girls. We were all friends and we worked together, helping each other debug problems and understand concepts. It was never a matter of gender; we looked at each other as peers, and we were all there to learn.

I am glad to say that it is possible for both girls and boys to work together and support one another in an advanced course. I saw it myself. I hope that this will not be the last time I witness this and I hope that everyone remains supportive of each other and excited to learn. 

I really enjoyed my Computer Architecture class and I look forward to my Neural Networks class next semester. 

Until next time,

Ashwini 
