Hello! I studied Computer Science, now I live in Bath! I write code, design games, and occasionally tweet. Why not subscribe?

Posts about University

My Graduation

Posted July 19, 2018. Photos, University. 172 words.

My Graduation Ceremony

Four years later, my time at university has come to a close and I have earned my Masters. It is a bittersweet day.

Photos: After the Ceremony, A Group of My Friends, The Reception

After the ceremony, everyone went outside for a year-group photo. We took photos of Jet, Lloyd, Daniel, me, Andy, and Edmund, and finally enjoyed the reception.

Third Year Project

Posted May 20, 2017. Dissertation, Javascript, University, Web. 40 words.

It’s done, it’s over! Months in the making, my dissertation is finished and available from lect.me. My advice for future students is to start early. Projects like these always take longer than you expect.

Lect.me

Designing Games with Unity

Posted May 19, 2017. Csharp, Games, Innovation, Unity, University. 487 words.

Having previously created games in my spare time and in competitions, I chose to team up with three different partners to create games focusing on gameplay, narrative experiences, and innovative technology using Unity. It was hard and took a lot of work, but in the end it was one of the most satisfying modules I took at university. Shout out to Rikki Prince, Dave Millard, and Tom for running such an excellent module.

Planet Deathmatch


A fast-paced, Quake-inspired, local-multiplayer, little-planet deathmatch infinite arena shooter. Hone your skills, then compete against your friends to see who can dominate the playing field. Supports up to 4-player split-screen; bring an Xbox controller. A student game created at the University of Southampton by Matthew Consterdine and Ollie Steptoe.

Featuring a number of classic weapons:

  • Shotgun: The short-to-medium-range wild card, capable of one-shotting your target or missing entirely.
  • Launcher: Fires explosive rockets, knocking back all the enemies in your way. Just be careful not to get caught in the blast.
  • Pistol: Are your opponents not on fire? That’s where the pistol comes in: it fires incendiary rounds, igniting targets.
  • Axe: A visceral weapon that can end your opponent in a couple of hits.

Well, what are you waiting for? Play today!


Littlest Billy-Goat


A fully narrated re-telling of the fairy tale classic. Single player, play with a mouse/keyboard or Xbox 360 controller. A student game created at the University of Southampton by Matthew Consterdine and Jeff Tomband. Download and play.


Let it burn!


Using your flame-thrower, rack up points and burn the forest down. Single player; play with a mouse/keyboard or Xbox 360 controller. A student game created at the University of Southampton during the Southampton Code Dojo. Burn down everything!


Last the Night


Last The Night is a procedurally generated first person survival game in which the player fights for their life after having crash landed on a mysterious, unknown planet. Armed with only a pistol, the player must fight off the various monsters inhabiting the planet, and only once the sun rises will they be safe.

With seed based world generation, there are literally millions of planets to explore with no two being the same, and with the addition of Easy, Medium and Hard difficulties, advanced players can challenge themselves whilst beginners can get a feel for the game. Last The Night features 17 different types of monsters, keeping the player guessing at all times.

A student game created at the University of Southampton by Matthew Consterdine and Ed Baker. Do you think you’re brave enough to last the night?

Machine Learning with MATLAB

Posted November 24, 2016. Machine-learning, Matlab, University. 4180 words.

I decided to investigate Machine Learning using MATLAB.

Posterior Probability

Figures: Posterior Probability 1–3

To compute the posterior probability, I started by defining two Gaussian distributions with different means and covariance matrices.

Using these definitions, I iterated over an N×N matrix, calculating the posterior probability of being in each class with the function mvnpdf(x, m, C). To display it, I chose a mesh because, with a high enough resolution, a mesh lets you see the pattern in the plane and also looks visually interesting.

Finally, I plotted the mesh and rotated it to help visualize the class boundary. You can clearly see that the boundary is quadratic, with a sigmoidal gradient.
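A minimal sketch of that computation; the means and covariances below are illustrative placeholders, not the exact values from the report, and equal priors are assumed:

    % Two example class-conditional Gaussians (placeholder parameters).
    m1 = [0 0];  C1 = [1.0 0.0; 0.0 1.0];
    m2 = [2 2];  C2 = [2.0 0.5; 0.5 1.0];

    % Evaluate the posterior of class 1 over an N-by-N grid, assuming equal priors.
    N = 200;
    [x1, x2] = meshgrid(linspace(-4, 6, N));
    X  = [x1(:) x2(:)];
    p1 = mvnpdf(X, m1, C1);
    p2 = mvnpdf(X, m2, C2);
    posterior = reshape(p1 ./ (p1 + p2), N, N);

    % A mesh makes the quadratic boundary and its sigmoidal gradient easy to see.
    mesh(x1, x2, posterior);
    xlabel('x_1'); ylabel('x_2'); zlabel('P(class 1 | x)');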


Classification using a Feedforward Neural Network

Figures: Classification using a Feedforward Neural Network 1–3

Next, I generated 200 samples from the definitions with the function mvnrnd(m, C, N), partitioning them in half into training and testing sets. With the first set, I trained a feedforward neural network with 10 hidden nodes; with the second, I tested the trained network and got the following errors:

  • Normalized mean training error:
  • Normalized mean testing error:

Both values are small, and the testing error being marginally larger than the training error is to be expected. This shows that the neural network has classified the data accurately.
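A sketch of the training and testing step, reusing the placeholder Gaussians from the sketch above; the error measure here is a plain mean-squared error rather than necessarily the exact normalisation used in the report:

    % Draw 100 samples per class and label them 0 / 1.
    X = [mvnrnd(m1, C1, 100); mvnrnd(m2, C2, 100)];
    t = [zeros(1, 100), ones(1, 100)];

    % Random half/half split into training and testing sets.
    idx = randperm(200);
    trainIdx = idx(1:100);
    testIdx  = idx(101:200);

    % Feedforward network with 10 hidden nodes; train() expects one sample per column.
    net = feedforwardnet(10);
    net = train(net, X(trainIdx, :)', t(trainIdx));

    trainErr = mean((net(X(trainIdx, :)') - t(trainIdx)).^2);
    testErr  = mean((net(X(testIdx, :)')  - t(testIdx)).^2);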

Figures: Classification using a Feedforward Neural Network 4–6

I compared the neural net contour (at 0.5) to both a linear and a quadratic Bayes’ optimal class boundary. It is remarkable how much better Bayes’ quadratic boundary is; I blame both the low sample size and the low number of hidden nodes. For comparison, I have also included Bayes’ linear boundary: it isn’t that bad, but it still pales next to the quadratic boundary.

To visualize, I plotted the neural net probability mesh. It is interesting how noisy the mesh is, when compared to the Bayesian boundary.

Figures: Classification using a Feedforward Neural Network 7–9

Next, I increased the number of hidden nodes from 10 to 20, and then to 50. As I increased the number of nodes, the boundary became more complex and the error rate increased, because the more nodes I added, the more I over-fitted the network. This shows that it’s incredibly important to choose the network size wisely; it’s easy to go too big!

After looking at the results, I would want to pick somewhere around 5-20 nodes for this problem. I might also train it for longer.

          Training Error    Testing Error
10 Nodes
20 Nodes
50 Nodes
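
A short sketch of that sweep, continuing from the training and testing split in the earlier sketch:

    % Retrain with 10, 20, and 50 hidden nodes to watch over-fitting appear.
    for h = [10 20 50]
        net = feedforwardnet(h);
        net = train(net, X(trainIdx, :)', t(trainIdx));
        fprintf('%2d nodes: train %.3f, test %.3f\n', h, ...
            mean((net(X(trainIdx, :)') - t(trainIdx)).^2), ...
            mean((net(X(testIdx, :)')  - t(testIdx)).^2));
    end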

Mackey-Glass Predictions

I was set the task of first generating a number of samples from the Mackey-Glass chaotic time series, then using them to train a neural net and try to predict future values.

Mackey-Glass is calculated with the equation:

dx/dt = β · x(t − τ) / (1 + x(t − τ)^n) − γ · x(t), commonly with β = 0.2, γ = 0.1, n = 10, and τ = 17.

For the samples, I visited the MathWorks File Exchange and downloaded a copy of Marco Cococcioni’s Mackey-Glass time series generator: https://mathworks.com/matlabcentral/fileexchange/24390

I took the code and adjusted it to generate samples, changing the delta from 0.1 to 1. If I left the delta at 0.1, the neural network predicted what was essentially random noise between -5 and +5. I suspect this was because the network was not getting enough information about the curve; the values given were too similar. You can see how crazy the output is in the bottom graph.

Next, I split the samples into a training set of 1500 samples and a testing set of 500 samples. This was done with . I then created a linear predictor and a feedforward neural network to compare how accurate their predictions were one step ahead.
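A sketch of the one-step-ahead comparison, assuming x is a column vector of 2000 Mackey-Glass samples from the generator above; the window length of 20 is an assumption:

    % Build lagged input windows: each column holds the previous `lag` samples,
    % and the target is the next sample.
    lag = 20;
    Xw = zeros(lag, numel(x) - lag);
    for i = 1:size(Xw, 2)
        Xw(:, i) = x(i:i+lag-1);
    end
    t = x(lag+1:end)';

    % Roughly the first 1500 windows for training, the rest for testing.
    Xtrain = Xw(:, 1:1500);   ttrain = t(1:1500);
    Xtest  = Xw(:, 1501:end); ttest  = t(1501:end);

    % Least-squares linear predictor versus a small feedforward network.
    w   = ttrain / Xtrain;
    net = feedforwardnet(10);
    net = train(net, Xtrain, ttrain);

    linErr = mean((w * Xtest  - ttest).^2);
    netErr = mean((net(Xtest) - ttest).^2);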

Figures: Mackey-Glass Prediction Errors 1–3
  • Normalized mean linear error:
  • Normalized mean neural error:

This shows that the neural network is already more accurate a single point ahead. If you continue, feeding the predicted outputs back in, sustained oscillations are not only possible; the neural net accurately predicts values at least 1500 steps into the future.

In the second and third graphs, you can see the error growing very slowly; however, even at 3000 steps the error is only 0.138.


Financial Time Series Prediction

Using the FTSE index from finance.yahoo.com, I created a neural net predictor capable of predicting tomorrow’s FTSE index value from the last 20 days of data. To keep the model simple and avoid over-fitting, I decided to use just the closing value, as the other columns wouldn’t really affect the predictions and would just serve to overcomplicate the model.
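A minimal sketch of that predictor; the CSV file name and the Close column name are assumptions about the Yahoo download, and the network size mirrors the earlier experiments:

    % Read the FTSE history and keep only the closing values.
    data    = readtable('ftse.csv');
    closing = data.Close;

    % Each input column is the previous 20 closing values; the target is the next day.
    lag = 20;
    Xc = zeros(lag, numel(closing) - lag);
    for i = 1:size(Xc, 2)
        Xc(:, i) = closing(i:i+lag-1);
    end
    t = closing(lag+1:end)';

    net  = feedforwardnet(10);
    net  = train(net, Xc, t);
    pred = net(Xc);   % plotted against t to judge the fit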

Financial Time Series Prediction 1

Feeding the last 20 days into the neural net produces relatively accurate predictions; however, on some days there is a significant difference. This is likely due to the limited amount of data and the simplicity of the model. It’s worth taking into account that the stock market is much more random and unpredictable than Mackey-Glass.

Financial Time Series Prediction 2

Next, I added the closing volume to the neural net inputs and plotted the predictions it made. Looking at the second graph, it makes different predictions which, from a cursory glance, look a little more in line.
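A sketch of that change, continuing from the previous sketch and assuming the download also has a Volume column:

    % Build matching windows of daily volume and stack them under the closing
    % values, giving the network 40 inputs per sample.
    vol = data.Volume;
    Xv  = zeros(lag, numel(vol) - lag);
    for i = 1:size(Xv, 2)
        Xv(:, i) = vol(i:i+lag-1);
    end
    net2  = feedforwardnet(10);
    net2  = train(net2, [Xc; Xv], t);
    pred2 = net2([Xc; Xv]);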

Financial Time Series Prediction 3

However, I wasn’t sure, so I plotted them on the same axes, and, nothing really: it just looks a mess. Plotting the two errors again gives nothing but a noisy, similar mess. Finally, I calculated the total area under each error graph and got:

Financial Time Series Prediction 4

  • Normalized close error:
  • Normalized close+volume error:

A difference of 0.011×10^5 is nothing when you are sampling 1000 points: it works out to an average difference of 1.131, or 0.059%.

From this, I can conclude that the volume of trades has little to no effect on the closing price, at least as far as my neural network is concerned. All that really matters are the previous closing values.

Overall, there is certainly an opportunity to make money in the stock market; however, using the model above, I wouldn’t really want to make big bets. With better models and more data, you could produce more accurate predictions, but you still must contend with the randomness of the market.

I suggest further research before betting big.

Continue Reading...

Aqua, an imperative language for manipulating infinite streams.

Posted April 28, 2016. Languages, Ocaml, University. 1503 words.

This is the user manual for the Aqua programming language created as part of Programming Languages and Concepts. Visit the project on Github.

Aqua is a C-like imperative language for manipulating infinite streams. Statements are somewhat optionally terminated with semicolons, and the language supports both block (/* ... */) and line (// ...) comments. Curly brackets are optionally used to extend scope. Example code can be found in the Appendices.

Before continuing, it’s helpful to familiarise yourself with Extended BNF. Special sequences are used to escape.

Continue Reading...

Solving Block World Search

Posted November 26, 2015. Algorithms, Java, University. 1866 words.

Block world is a simple 2D sliding puzzle game taking place on a finite rectangular grid. You manipulate the world by swapping an agent (in this case the character ☺) with an adjacent tile. There are up to 4 possible moves that can be taken from any tile. As you can imagine, with plain tree search the problem quickly scales to impossibility for each of the blind searches.

It is very similar to the 8/15 puzzles, just with fewer pieces, meaning it’s simpler for the algorithms to solve. It’s unlikely any of my blind searches could solve a well shuffled tile puzzle with unique pieces, but I suspect my A* algorithm could. However, before doing so I would want to spend time improving my Manhattan distance heuristic, so it gave more accurate results over a larger range.

I decided to use Java to solve this problem, as I’m familiar with it and it has a rich standard library containing Queue, Stack, and PriorityQueue. These collections are vital to implementing the four search methods. You can implement the searches in other ways, but the data structures I listed handle the bookkeeping for you.

Continue Reading...

Network Penetration Testing

Posted October 29, 2015. Cybersecurity, Networks, University, Web. 1581 words.

Using Nmap, I was tasked with scanning an IP range, to evaluate and report vulnerabilities.

Continue Reading...

Measuring IPv4 vs. IPv6 Performance

Posted October 19, 2015. Bash, Networks, University, Web. 4195 words.

I produced a series of bash scripts to automate the process of pinging the list of websites. I chose bash as it is trivial to pipe the output from ping into various other command-line programs such as sed, gawk, and wget. As it was completely automated, I decided to start early and just let it run. In total I pinged the top 100,000 websites up to 100 times each using script.sh (see Appendix A), recording useful statistics.

Continue Reading...

Software Modelling and Design

Posted March 2, 2015. Modelling, University. 1185 words.

In this report, I model an auction system using the Unified Modelling Language.

Continue Reading...

Where did the Web come from?

Posted January 9, 2015. History, University, Web. 1543 words.

From an underground Swiss bunker to all around the world; the World Wide Web has transformed from an experiment in academic distribution to the massively interconnected network that we know today. While the current Web is relatively new, it has not only revolutionised the world, but promises to continue as our society strives towards the Internet of Things.

This report will untangle not only the history of the World Wide Web, but also its many predecessors: designed and implemented, successful and not. This will be accomplished by studying some of the attempts at webs from the past century, starting with the Mundaneum and ending with the current Web.

Towards the end, the report will look at the future of the Web. Devices are becoming connected, with smart devices such as phones, televisions, and even thermostats sharing data with each other and with their manufacturers. Privacy has been, and will continue to be, an important issue as greater amounts of data are shared.

Past

For millennia, libraries and educational institutions have provided people with limited access to both the culture and knowledge of the time 1. Until the advent of Chinese movable type and, more importantly, the mechanical Gutenberg printing press, duplicating and sharing knowledge was both laborious and time-consuming. The press revolutionised the distribution of knowledge, with 20 million books printed by the dawn of the 16th Century 2.

Knowledge was spreading faster than ever before, but the concept of near instant communication was still centuries away. Cooke’s and Wheatstone’s 1837 electric telegraph, the first working electric telegraph, spread from railways to postal offices 3. Finally, instant long distance communication was possible, ushering us into the age of connected webs.

In the past hundred years, at least a dozen different webs were designed. Once thriving, most have now fallen into disuse. While the World Wide Web lives on, stronger than ever, the past clings to life. In most of the world, older technology is slowly being phased out in favour of the new, often built as a service residing on the World Wide Web.


Historical Attempts at Webs

The World Wide Web didn’t just appear out of nowhere. Like all great inventions it was inspired by what came before, building upon the shoulders of giants.

Mundaneum

Created by Paul Otlet, a Belgian lawyer, the Mundaneum was a collection of fifteen million library cards pointing towards its million documents 4. To use it, you telephoned and asked about a topic; the staff would then go to the cards, relay the information to you, and suggest what else to research next. The Mundaneum was both the Google and the Wikipedia of the early 20th Century.

Memex

Memex was a hypothetical precursor to hypertext designed to “supplement […] one’s memory” 5. Vannevar Bush envisioned a mechanized, bookmarked microfilm system allowing the user to easily find information. Unlike HTML, linking was never a focus of the design, nor could you connect to any external source to obtain additional information.

Ceefax and Teletext

After 38 years of broadcasting, the UK digital switch-over ended the run of both Ceefax and Teletext 6. Primitive even by the late 1990s, the service utilised part of a channel’s bandwidth to display digital news on an analogue TV. At its peak, over a third of the UK entered page numbers to read split news stories, gaze at pixel-art weather, and check the sports 7.

HyperCard

Designed around a stack of cards 8, HyperCard allowed the user to navigate or search through them. Cards contained useful information retrieved from an inbuilt database and, depending on the card, could be interactive (HyperCard was programmed with the HyperTalk language). Exclusive to classic Apple Macs, HyperCard never reached mass popularity.

CompuServ & America OnLine

In 1980s America, the dominant online service was CompuServe. For a monthly fee, users could browse its “walled garden”, read newspapers 9, shop, and send 60 emails a month 10. After competing for a decade, AOL bought CompuServe in 1998.

A relic to most, AOL retains 2.3 million dial-up customers 11. During the 1990s the company bombarded the public with $300 million worth of trial CDs 12, inviting them to join. Designed for novices, the service included AOL features such as channels, email, and games. Access to the World Wide Web and Usenet was eventually added.


The World Wide Web

Built on top of TCP/IP and decentralised, with a unique identifier for every resource to promote linking between pages, the World Wide Web was created by Sir Tim Berners-Lee whilst working at CERN to “record random associations between arbitrary objects” and present them in an easy-to-understand, language- and encoding-independent way 13. Previously, incompatible technologies and a lack of standardisation hindered geographically separate entities from collaborating.

Compared with today, the early Web consisted of mostly static web-sites. CERN httpd, the world’s first web-server 14, originally fetched static files to send to the receiver. These files consisted of HTML, images and assorted documents. Hyper Text Mark-up Language, an extension of SGML 15, defines the structure of web-pages. The anchor tag not only links web-pages and files together, but does so in a scalable, distributed way; this design let the Web scale exponentially.

Over the next two decades, the usage of the internet skyrocketed - primarily in developed countries. While in more recent years adoption elsewhere is increasing, it continues to lag behind the developed world.

Advances in the Web, such as the development of JavaScript at Netscape (originally designed in 10 days) 16, have led us to the interactive, dynamic Web we know today.


Present and Future

Today virtually every part of our society has both adopted and adapted the Web to fit its needs. The Web was originally a system designed purely for spreading academic information; today users from nearly every corner of the globe browse online stores, news organisations, branches of government, content producers - the list goes on and on.

The World Wide Web has changed, and it has changed us. Never before have we been able to effortlessly communicate with anyone on the planet. The internet connects us all, via switches, routers and buried wire. This connection is so powerful that, within a third of a second, a packet can be delivered from the United Kingdom to New Zealand 17.

The connectedness that the internet brings continues to spread the Web and help create the future. Today we are researching and developing concepts such as the Internet of Things. Dreams of automated homes and self-driving cars, and concerns over privacy and redundancy, continue to grow.

I don’t know what the future will bring, but I sure am looking forward to finding out.


References

  1. M. Harris, History of Libraries of the Western World, Scarecrow Press, 2012. 

  2. L. Febvre and H.-J. Martin, The Coming of the Book: The Impact of Printing 1450-1800, London: New Left Books, 1976. 

  3. G. Hubbard, Cooke and Wheatstone: And the Invention of the Electric Telegraph, Routledge, 1965. 

  4. P. L. Carr, Where did the Web come from?, Southampton University, 2014. 

  5. L. Manovich, “As We May Think,” The New Media Reader, p. 35, The MIT Press. 

  6. M. Brown, “Teletext Museum”: http://teletext.mb21.co.uk/ [Accessed 3 Jan 2015] 

  7. M. Engel, “Ceefax: A love letter,” 18 Apr 2012: http://bbc.co.uk/news/magazine-17745100 [Accessed 28 Dec 2014] 

  8. M. L. Sandra V. Turner, HyperCard: a tool for learning, Wadsworth Pub. Co., 1994. 

  9. S. Newman, “Steve Newman’s report on CompuServe’s newspaper addition,” 1981: https://youtube.com/watch?v=5WCTn4FljUQ [Accessed 4 Jan 2015] 

  10. L. Northrup, “CompuServe In 1994: Here, You’ll Never Outgrow 60 E-Mails Per Month,” Consumerist, 5 Sep 2014: http://consumerist.com/2014/09/05/compuserve-in-1994-here-youll-never-outgrow-60-e-mails-per-month/ [Accessed 6 Jan 2015] 

  11. V. Kopytoff, “For AOL dial-up subscribers, it’s life in the slow lane,” Fortune, 11 Dec 2014: http://fortune.com/2014/12/11/aol-dialup-subscribers/ [Accessed 6 Jan 2015] 

  12. J. Brant, “How much did it cost AOL to distribute all those CDs back in the 1990s?,” Quora, 28 Dec 2010: https://quora.com/How-much-did-it-cost-AOL-to-distribute-all-those-CDs-back-in-the-1990s/answer/Jan-Brandt [Accessed 6 Jan 2015] 

  13. S. T. B. Lee, “WWW - Past, Present and Future,” IEEE, vol. October, pp. 69-77, 1996. 

  14. CERN, “Change History for httpd,” CERN, 15 Jul 1996: http://w3.org/Daemon/Features.html [Accessed 7 Jan 2015] 

  15. A. W. Longman, “A History of HTML,” w3, 1998: http://w3.org/People/Raggett/book4/ch02.html [Accessed 7 Jan 2015] 

  16. C. Severance, “JavaScript: Designing a Language in 10 Days,” Computer, vol. 45, no. 2, pp. 7-8, 2012. 

  17. Verizon, “IP Latency Statistics,” Verizon, 2015: http://verizonenterprise.com/about/network/latency/ [Accessed 7 Jan 2015] 

This Week I Start University

Posted October 1, 2014. Photos, University. 33 words.

This week I start studying MEng Computer Science at the University of Southampton. A new adventure begins! Image courtesy of Wessex Scene.

University of Southampton Green
