Thursday 31 March 2016

Integrating Unity app with Twitter

In the last few days we've been getting all social-media-savvy, writing some simple PHP scripts to interface with Twitter. Using the TwitterAPIExchange library we scraped Twitter for recent tweets with a specific hashtag.

Originally, we planned on using a Unity Asset to query Twitter directly, but quickly decided against it. The Twitter API specification says that no more than 180 requests can be made in any one hour; that's three a minute. While that's not actually all that bad, if we ended up with, say, 10 people all running the same app - since each request is linked to a specific user account for authentication - we could quickly use up all our requests before the hour was up.

We eventually decided on a web-server-based approach: our PHP script would query Twitter at no more than one request per minute, and the Unity app would hit our server with a request for the latest tweets. This means there's no danger of upsetting the Twitter gods and getting barred for exceeding our request quota (we did this a few years back, when first playing with Twitter's OAuth API - getting banned makes further app development quite tricky!).

By using our own server to cache tweets, we're also able to add in things like filtering, removing or replacing inappropriate content and so on. It also means that, should we want to reuse this app-to-Twitter-style interface at any time in the future, we can run it completely standalone, or maybe plug it into a different social media platform (the exact same app could read back posts from a Facebook timeline, for example, since it doesn't actually connect to the endpoint where the messages are created).

Our PHP for scraping tweets looked something like this:

<?php
ini_set('display_errors', 1);

// get the last time this script hit twitter.
// make sure we only do it once per minute maximum, otherwise we
// might get barred from Twitter altogether!

$run_twitter=0;

if(is_file("/var/tmp/last_ran.r")){
      // get the last time the twitter website was polled
      $f="/var/tmp/last_ran.r";
      $file = fopen($f,"r");
      //fread($file,filesize($f));
      $ldt=fgets($file);
      fclose($file);

      $dt=date('YmdHi');

      if($dt==$ldt){
            $f="/var/tmp/last_tweets.t";
            $file = fopen($f,"r");
            $s=fread($file,filesize($f));
            fclose($file);
            echo($s);
      }else{
            $run_twitter=1;
      }

}else{
      $run_twitter=1;
}


if($run_twitter==1){

      require_once('TwitterAPIExchange.php');

      $oauthAccessToken = 'access-token-here';
      $oauthAccessTokenSecret = 'access-secret-here';
      $oauthConsumerKey = 'consumer-key-here';
      $oauthConsumerSecret = 'consumer-secret-here';

      $hashtag="nerdclub";
      if(isset($_GET['hashtag'])){ $hashtag=trim($_GET['hashtag']); }
     
      $settings = array(
      'oauth_access_token' => $oauthAccessToken,
      'oauth_access_token_secret' => $oauthAccessTokenSecret,
      'consumer_key' => $oauthConsumerKey,
      'consumer_secret' => $oauthConsumerSecret
      );
     
     
      $url = 'https://api.twitter.com/1.1/search/tweets.json';
      $requestMethod = 'GET';
      // URL-encode the # (as %23) so it isn't treated as a fragment marker
      $getfield = '?q='.urlencode('#'.$hashtag).'&result_type=recent';
     
      // Perform the request
      $twitter = new TwitterAPIExchange($settings);
      $json= $twitter->setGetfield($getfield)
      ->buildOauth($url, $requestMethod)
      ->performRequest();
     

      $twits="";
      $response = json_decode($json);

      foreach($response->statuses as $tweet) {
            // one tweet per line, so the Unity app can split the response on newlines
            $twit = "[{$tweet->user->screen_name}] {$tweet->text}\n";
            echo($twit);
            $twits.=$twit;
      }

      // now write the current time to the last ran file
      $file = fopen("/var/tmp/last_ran.r","w");
      fwrite($file,date('YmdHi'));
      fclose($file);

      // and write the last lot of tweets out to disk
      $file2 = fopen("/var/tmp/last_tweets.t","w");
      fwrite($file2, $twits);
      fclose($file2);

}

?>
     

The PHP file writes the current date, in YmdHi (year, month, day, hour, minute) format, to a temporary file on the server. Each time the script runs, it compares the current time to the time it last queried Twitter. If the two are the same, it does not fire off another request, instead returning cached results. If the dates are different, then we know we're only performing, at most, one request per minute, and so the script fetches a fresh lot of data from Twitter.

With this in place, it was just a simple matter of building a script which would fetch the text from our own web server, split it into pieces and display it on the appropriate Unity GUI canvas object.


using UnityEngine;
using UnityEngine.UI;
using System;
using System.Collections;

// the class name here is arbitrary - attach it to a GameObject and link txtMessage in the inspector
public class TwitterTicker : MonoBehaviour {

public string hashtag;
private string base_url = "http://www.nerdclub.co.uk/twitter/index.php?hashtag=";
private string twatterString;
private string[] tweets;
private string[] words;
private int tweetIndex;
private int wordIndex;
public Text txtMessage;            // link this to the tweet textbox in Unity editor
public float wordSpeed=0.2f;
public float delayBetweenTweets = 4f;
     
void Start () {
      Invoke ("startTwatter", 1f);
}
     
void startTwatter(){
      txtMessage.text = "";
      Invoke ("restartTweets", delayBetweenTweets);
}

void restartTweets(){
      if (hashtag.Length < 1) { hashtag = "nerdclub"; }
      string url = base_url + hashtag;
      Debug.Log (url);
      WWW www = new WWW(url);
      StartCoroutine(WaitForRequest(www));
}

void displayNextTweet(){
      // get the name of the person/account who sent the tweet
      // split the tweet using spaces to create words
      tweetIndex++;
      if (tweetIndex >= tweets.Length) {
            // end of tweets
            Debug.Log ("no more tweets");
            txtMessage.text = "";
            tweetIndex = -1;
            Invoke ("restartTweets", delayBetweenTweets);
      } else {
            wordIndex = 0;
            twatterString = tweets [tweetIndex];
            if (twatterString.Trim ().Length == 0) {
                  Invoke ("displayNextTweet", wordSpeed);
            } else {
                  words = twatterString.Split (new string[] { " " }, StringSplitOptions.None);
                  Invoke ("nextWord", wordSpeed);
            }
      }
}

void nextWord(){
      if (wordIndex >= words.Length) {
            // end of tweet
            Debug.Log ("End of tweet");
            Invoke ("displayNextTweet", delayBetweenTweets);
      }else{
            // display this word
            Debug.Log(words[wordIndex]);
            txtMessage.text=words[wordIndex];
            Invoke ("nextWord", wordSpeed);
      }
      wordIndex++;
}

IEnumerator WaitForRequest(WWW www){
      yield return www;
      // check for errors
      if (www.error == null) {
            Debug.Log ("Response: " + www.text);
            twatterString = www.text;
            tweets = twatterString.Split(new string[] { "\r\n", "\n" }, StringSplitOptions.None);
            tweetIndex = -1;
            Invoke("displayNextTweet", delayBetweenTweets);
      } else {
            Debug.Log("WWW Error: "+ www.error);
      }
}
} // end of TwitterTicker class


The result looks something like this:


Here we've changed the tag to #harveyandjohn - those crazy inventor types from Brighton who - unlike us - haven't been neglecting their Twitter presence, so it was easy to find recent tweets with a matching tag.


Tuesday 29 March 2016

Raspberry Pi shield design

Sometimes I feel a bit like the late kid to a party. The one who's just got there after everyone else has already got blazing drunk/copped off/filled the kitchen sink with vomit. It's taken me a long time to appreciate that there may actually be a place for the Raspberry Pi microcomputer. So while I'm just working out what this little box of tricks can actually do, and how to coax it into life using a cobbled mix of PHP and Python, everyone else has already been there, done that and moved on.

But in all honesty - much like my recent foray into the world of Arduino - I've used these platforms because someone else asked me to, not because I wanted to. I can't pretend to be a convert to R/Pi, but it has been fun seeing what's possible with a bit of "creative coding".

Anyway, I've been using R/Pi and Arduino together (double-shudder) to create hardware that can be controlled via a series of web pages on any smart phone/tablet.

Once the software was written, I needed a way of getting the two connected together without leaving a mass of wires and breadboards lying around the place. What I needed was a Raspberry Pi "shield".


I didn't really fancy making a double-sided board, and since the 0.1" pitch sockets would be mounted on the non-etched side of the board, I had to plan for the other stuff (Arduino, sockets for connecting other devices etc) to be surface-mounted. So the etched tracks would be "face-up" with the Arduino (ProMini) mounted on top.


As I didn't have any surface mount sockets for the Arduino, I chopped up some DIP sockets and bent the little legs out to the sides.


Then we placed the Arduino into the sockets (to ensure they remained the correct distance apart), held the whole thing in place and soldered the legs to the PCB.


After fitting a couple of extra sockets in a similar fashion (I couldn't use them "correctly" in the normal through-hole fashion, as putting them on the underside would cause them to clash with the surface of the Pi underneath) the "shield" was pretty much done.


The total thickness of the entire thing is less than 40mm. Less than 35mm if I take that rather unnecessary ICSP header off the Arduino.

The PCB was just drawn freehand in ExpressPCB. With a bit of planning, I'm sure the "shield" wouldn't need to be even as wide as the Raspberry Pi - making it sit nicely inside even the smallest of enclosures.

After weeks of being a bit of a code monkey, it's been a pleasant change to stain my fingernails brown and make my hands smell of ferric chloride again. Less "work" and more electronics stuff please!

Monday 28 March 2016

Controlling Arduino with a web page

Here's a little project I've been working on lately.
It's using a Raspberry Pi as a web server and a wifi access point, as well as talking to an Arduino over serial. It means we can use a web-based interface to change settings and parameters on the Arduino without exposing the user to nasty strings of hexadecimal.

It uses a right old clumsy mix of a LAMP stack on the Pi (well, an Apache webserver and PHP although there are plans to do a little bit of database logging too) and Python.

There's no nice way to get the web server talking to Python (or vice versa) so everything is done through exchanging temporary files on the server.

When a request comes in from the web-based app, our phone (or tablet or whatever) is talking to PHP. The PHP script creates a temporary file on the Pi (we created a small ram-disk of about 5Mb to put these temporary files on, so we're not continually writing to and deleting content on the actual SD card).


Then we have a Python script running as a daemon service.
It constantly looks for temporary files and, if it finds one, parses the contents to see what has been requested. The script then takes action - where needed - and, if necessary, writes a response to another temporary file. The PHP script can then read these files and report the results back to the web interface.
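Just to illustrate the idea, here's a stripped-down sketch of the Python side (not our actual daemon - the ramdisk path, file names, serial port and baud rate are all assumptions here):

# Minimal sketch of the file-exchange daemon idea (not our production script).
# The PHP side writes request files into the ramdisk; this loop picks them up,
# forwards the command to the Arduino over serial, and writes a matching
# response file for PHP to read back.
import glob
import os
import time

import serial   # pyserial

RAMDISK = "/mnt/ramdisk"                                    # assumed ramdisk mount point
arduino = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)    # assumed port/baud rate

while True:
    for request_file in glob.glob(os.path.join(RAMDISK, "request_*.txt")):
        with open(request_file) as f:
            command = f.read().strip()

        # forward the request to the Arduino and wait for its reply
        arduino.write(command + "\n")
        reply = arduino.readline().strip()

        # write the response where the PHP script expects to find it
        response_file = request_file.replace("request_", "response_")
        with open(response_file, "w") as f:
            f.write(reply)

        os.remove(request_file)

    time.sleep(0.1)   # don't hammer the ramdisk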

It's clunky and nasty.
But it works.
And is pretty robust.

So now we don't even need to build "apps" for controlling our connected microcontrollers (of course, in the fullness of time we'll be talking to a PIC over serial - we were asked to use Arduino as a proof of concept for someone else to follow). We can just throw a Raspberry Pi onto a UART and use that for all manner of fancy graphical front-ends.


Windows 10 - still rubbish after a clean install

It seems that not only am I the only one having problems with Windows 10 - everyone else actually loves it! Steve got a whizzy new computer and - having been married to XP for nigh on 14 years and refusing to even look at another operating system - suddenly Windows 10 is the new big thing.

My laptop had got to the point where it was intolerably slow. Ten minutes after downloading a file and Windows Defender would still have hold of it, making it impossible to open, copy or move. At points throughout the day, the laptop would completely lock up for minutes at a time.

It was pretty much decided that my laptop was borked.
Or, at least, my Windows 10 install was.

So with a bootable USB version of Windows 10, I went for the nuclear option.
And at first it seemed to work. The laptop booted up much quicker than it had done since the days it first arrived at Nerd Towers with the shiny new Windows 8.0 operating system. Apps (really? apps? not applications, or programs?) launch quicker than they have done in a long while. And moving files about doesn't invoke Windows Defender for twenty minutes any more.

For the next few days, it was all about installing the software I'd forgotten I had.
Stuff I'd forgotten I used quite regularly. Like Notepad++. And HeidiSQL. And lots of weird little programs - not just VB6!

But then, just as I got comfortable with the new Windows install, I noticed a few things had changed: my default scheme colour was green, not blue. That's ok. I quite like it. But there were some other things I didn't like quite so much.

Microsoft Edge (which is a half-decent browser, but it's no IE11) doesn't like loading local html files, and always tries to find a live internet connection, even if there isn't one (loading pages from a Raspberry Pi server is painfully slow on my Windows 10 machine).

The double-finger scroll option (using two fingers side-by-side on the trackpad to scroll a page or panel up/down) stops working intermittently. I hadn't realised how much I use this feature, until I had to keep using the mouse pointer to grab the scrollbar in frustration.

But the most annoying thing - on a device I'm trying to use to program apps for an interactive guitar - is the godawful sound. Previously, 50% volume was LOUD. Now, I've got to run the speakers at 100% just to be able to hear voices in a Youtube video - and even then turn down the radio otherwise it's difficult to make out what they're saying!

It's not an uncommon problem. It was reported back in Oct 2015 and is still not fixed. Worse than that - I found some old Realtek audio drivers and installed them, and did get a little "boost" to the volume. Then rebooted the machine. And Windows "upgraded" the drivers - to their own crappy quiet versions, and made the audio super-tinny and quiet again.

Honestly, Microsoft.
You should have stopped at 7.
XP even.
And left us with computers that actually worked.

My old boss tried to compare computers over the years to cars.
Computers have got easier to use, and any old idiot can get on the internet with one (that wasn't always the case). Cars have got more and more safety features, and anyone with a laptop can diagnose faults when they go wrong. Things have got easier.

And at the same time, the experience - for people who know what they are doing - has got worse. Cars may be more fuel efficient and less likely to skid when you hit the brakes. But on a racetrack, in the hands of a knowledgeable driver, they're horrible to drive.
The same goes with computers - they're easier and more accessible, but also much more difficult to actually get to do anything the providers haven't allowed for (try running your own programs using Windows subclassing on Windows 10 - Defender goes berserk, even if it's fairly harmless stuff!)

Or, as my old boss put it - "with Windows 10 it's like they've made an automatic, put the brake on the dashboard and taken the bloody steering wheel off and left it in the boot". I know exactly how he feels!


Friday 18 March 2016

Guitar fretlights testing

Exciting times at Nerd Night tonight; after a few evenings of coding fresh firmware and app development in Unity, we managed to get a working demonstration of our guitar fretlights this evening.

In recent evenings, we've been able to select patterns from eeprom and display them across the neck using a rotary encoder and miniature LCD. Then with a bit of hacking, we managed to get the patterns to display in response to some serial/UART commands.

It took a little while to edit the firmware so that as well as displaying patterns we could get the device to change key. But once all these elements were in place, we decided to go for the big one.....

Here's a video showing a dedicated app, playing an mp3.
In the app, we can add "events" to our "timeline". As the events occur, we send messages over bluetooth serial to the device, which responds by changing the LED arrangements.

It's not perfect, but it's good enough to demonstrate that the idea works:


For those who are interested in such things, it's a backing track downloaded off Youtube and turned into an mp3 - some kind of blues backing in A. The first event sets the pentatonic scale to "channel one" (the "background" channel for want of a better term). The second event, shortly after, sets the "root notes" pattern on channel two (the foreground channel).

As the song progresses, listen out for the chord changes.
When the IV chord comes along, the original pentatonic scale, in A, remains on the "background" channel, but the root notes change, to match the chord that's currently playing. When the song returns to the I chord, the root notes change back. Similarly at the V chord, the root notes light up all of the E notes on the neck (in a I-IV-V progression in A, the chords are A, D and E).

Brightness on the LEDs has been turned right down.
And somewhere in the middle of the video, Jake says "fuckboots".
Thanks Jake.

Friday 11 March 2016

Raspberry Pi - I finally get it.

I've never really been a fan of the Raspberry Pi. Mostly because I don't understand it. Not that I don't understand Linux (but I don't) nor that I don't understand the R/Pi's favoured programming language Python (I do, but I don't like it). It's just that - for a long time - I never really understood what the point of the Raspberry Pi was.

About 95% of the projects online that involve a Pi are using vastly over-powered hardware to do relatively simple stuff. Read a temperature sensor and report it over the interwebs? An 8-bit micro and a wifi module would do that just as well. The Pi is massively over-spec for most of these kinds of projects.

For anything a little more adventurous, it's really not quite beefy enough. Real-time object tracking would be a great idea - but at 700MHz it's not quite got the grunt-power to do actual real-time processing without a noticeable (and slightly annoying) lag.

Whichever type of project it's used for, it always seemed to be an odd fit. Too beefy for simple stuff. And not beefy enough for complex stuff. Which meant that most projects for the Raspberry Pi are simply ports from other platforms, just for the sake of it, rather than because the Pi was best suited to the job.



And for that reason, I'd not really bothered with the Raspberry Pi.
I got one and played with it. I got a camera and the model B+ (and the Pi 2 when it came out). I even ported some simple blinky LED stuff, just to see what it could do. But it was always just a fancy gadget with no real purpose. A few of the other nerds got one and made it do fancy stuff - but in all honesty, not really much more than you can do with an 8-bit micro and some clever coding skills!

But in recent weeks, we've been using the Raspberry Pi quite a bit. It's still a horrible little thing to program for, using Python. But it's actually been quite useful for one major use: creating a web interface for accessing hardware.

Until recently, we've been using either wifi or bluetooth to talk to our hardware - the ESP8266 and the BTLE4 modules are cheap enough (at just a couple of quid) to throw onto an 8-bit micro to create a gateway for an app to talk to it. But as easy as this is to do, it almost always results in us creating some kind of app to talk to the bluetooth/wifi module.



So, at last, we've found a real purpose for a Raspberry Pi.
Sticking a web server (Apache) on the Pi and installing php means we can create and serve web pages that any device - iOS, Android, Windows, Linux - anything with wifi and a web browser can access. And because the Pi has an accessible i/o port, we can route requests from the web server to turn pins on and off. And, as any good nerd knows, if you can flash an LED, you can make your hardware do just about anything!
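To give a flavour of the "flash an LED" part, here's a minimal sketch using the RPi.GPIO library (the pin number is just an example, and it's not lifted from our project):

import time
import RPi.GPIO as GPIO

# BCM pin 18 is just an example - use whichever pin your LED is wired to
GPIO.setmode(GPIO.BCM)
GPIO.setup(18, GPIO.OUT)

# blink the LED a few times - a web request handler could call code just like this
for i in range(5):
    GPIO.output(18, GPIO.HIGH)
    time.sleep(0.5)
    GPIO.output(18, GPIO.LOW)
    time.sleep(0.5)

GPIO.cleanup()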

So with our Pi set up as a wifi access point, we can simply connect, open a browser and use HTML and javascript to turn things on and off. While people have been doing this for years, it's the combination of the camera module, accessing the photos it creates over wifi and bundling the whole lot up into a web-based app using websockets (to reduce latency) that suddenly means we're looking at the Raspberry Pi in a whole new light!

Just using a web server and calling webpages and ajax functions is ok. But it's relatively slow (in computing terms). Using web sockets means we're able to get our response times down to almost as low as a regular "ping" round-trip; sub-20-milliseconds is quite feasible.
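For anyone wanting to check that kind of figure, a quick-and-dirty round-trip timer looks something like this (an assumption-laden sketch: it uses the websocket-client package on a laptop connected to the Pi's access point, with the server below already listening on port 1975):

import time
from websocket import create_connection

ws = create_connection("ws://192.168.42.1:1975")
ws.recv()                                   # swallow the initial "connected" message

for i in range(10):
    start = time.time()
    ws.send("ping %d" % i)
    ws.recv()                               # wait for the server's echo
    print "round trip: %.1f ms" % ((time.time() - start) * 1000)

ws.close()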

So here's how we set up our socket server:



The script websocket.py runs on the Raspberry Pi, responding to incoming web socket connections and messages. At the minute, we're using it as a simple relay server, but it proves the idea quite nicely. Here's the web browser sending and receiving messages from the Raspberry Pi over web sockets.


The web socket server (Python) script:

import tornado.httpserver
import tornado.ioloop
import tornado.options
import tornado.web
import tornado.websocket

class WebSocketHandler(tornado.websocket.WebSocketHandler):
      def open(self):
            print "new connection"
            self.write_message("connected")

      def on_close(self):
            print "connection closed"

      def on_message(self, message):
            print "message received {}".format(message)
            self.write_message("message received: "+message)

      def check_origin(self, origin):
            return True

if __name__ == "__main__":
      tornado.options.parse_command_line()
      app = tornado.web.Application( handlers=[(r"/", WebSocketHandler)] )

      httpServer = tornado.httpserver.HTTPServer(app)
      httpServer.listen(1975)
      print "Listening on port 1975"
      tornado.ioloop.IOLoop.instance().start()

      # this is a blocking thread - nothing more to do here


And the HTML to talk to the web socket server:

<html>

<head>
   <script src="jq.mobile/jquery-2.2.1.min.js" >
</script>
</head>

<body>
<br/><br/>

<textarea id="log"></textarea><br/>
<input type="text" id="txtSend" ><input type="button" value=" send " onclick="sendData();" />

<script>
   var socket;
   function logIt(s){
      var t=$('#log').val();
      t=t+"\n"+s;
      $('#log').val(t);
   }

   function sendData(){
      var t=$('#txtSend').val();
      try {
         socket.send(t);
      } catch(exception){
         logIt("Error sending "+exception);
      }
   }

   if(window.WebSocket){
      socket = new WebSocket("ws://192.168.42.1:1975");
      socket.onopen = function(){
         logIt("Socket status "+socket.readystate+" (open)");
      }
      socket.onmessage = function(msg){
         logIt("Received ["+msg.data+"]");
      }
      socket.onclose = function(){
         logIt("That's all folks!");
      }

   }else{
      alert("You can't use this, bog off");
   }
</script>

</body>
</html>


This combination of web server to deliver fancy app-like performance, plus websockets to give serial-like communications speeds, means we can control our hardware with great looking interfaces without having to deploy and install native apps for each device type. Finally, a task for which the Raspberry Pi is perfectly suited!

Wednesday 9 March 2016

Setting up a Raspberry Pi(2) as a wifi Access Point

There are loads of tutorials all over the intertubes on how to do this, so there's no point repeating everything verbatim. Also, because of different combinations of hardware, no one guide can do justice to a complete step-by-step walkthrough.

We tried three or four online guides and at various points along the way, hit an obscure error message that didn't really mean an awful lot - so debugging and troubleshooting was difficult. Whenever we hit a problem, the best we could hope to do was reboot the Pi and start again, using someone else's guide!

Eventually, after following the Adafruit guide (almost) to the letter - but ignoring the "use rtl871xdrv" setting in hostapd.conf - we managed to create a wifi access point that various devices (Windows laptop, Android phone, Linux desktop etc.) would connect to.

Here's our Raspberry Pi responding to incoming connections to the wifi access point.

At first things looked quite promising - the Pi showed up as a wifi access point, using the SSID we set, and would accept (and reject) connections based on the WPA/passphrase. Once we got to the "connect and test" stage, things didn't quite pan out.

On our Windows 10 machine, after "connecting", the IP address was way out of the range we'd set on the Pi. Trying to ping the Pi on its fixed IP address resulted in a "General Failure" error message. The Android phone started the connection process, which appeared in the log on the Raspberry Pi, but was never given an IP address. After a short delay, the phone would give up and prompt us to forget the connection.

Although the Adafruit website suggested that the problem was probably with the DHCP server configuration, it took some hunting around until we found this guide at http://www.pi-point.co.uk/documentation/:

The final step is to configure dnsmasq so you can obtain an IP address from your new Pi-Point. Edit your /etc/dnsmasq.conf file to look like this: 

# Never forward plain names (without a dot or domain part)
domain-needed

# Only listen for DHCP on wlan0
interface=wlan0

# create a domain if you want, comment it out otherwise
#domain=Pi-Point.co.uk

# Create a dhcp range on your /24 wlan0 network with 12 hour lease time
dhcp-range=192.168.1.5,192.168.1.254,255.255.255.0,12h

# Send an empty WPAD option. This may be REQUIRED to get windows 7 to behave.
#dhcp-option=252,"\n"


Remember to change the dhcp-range option for the network IP range you're using. If you're having problems with Windows7 machines then try uncommenting the last line. Restart dnsmasq by typing service dnsmasq restart for the configuration to take effect. Now you should be able to connect to your new Pi-Point and get a proper IP address from it. 


Although we're not running the pi-point software, we had a working wifi access point, so we just picked things up from there. We changed the ip range to match the one we used while following the Adafruit guide and - amazingly - the access point started working!



The connection process takes just seconds, and each device is correctly given its own IP address from the DHCP server running on the Raspberry Pi. Since we set up the Pi with a static IP address of 192.168.42.1, even the ping request from our Windows laptop worked!




So now we've got our AP working on a Pi, we're going to have to learn how to respond to incoming connection requests from apps running on our devices, and how to process them to make meaningful stuff happen. It's only a small step so far, but quite a significant one!



Tuesday 8 March 2016

CNC spindle upgrade

We've had our old desktop CNC collecting dust for quite a while now.
It's a beast of a machine (for a little desktop thing) and really sturdy, built from lumps of aluminium bar all bolted together.

But it doesn't really get the use it deserves.
Partly because it's really slow.
And partly because it's quite noisy.
But mostly it's because the Dremel clone we were using for routing was knackered - and Dremels in general have too much run-out to be used for anything other than simple, coarse shape carving.

So while we're waiting for a few controller boards to arrive to upgrade the entire drive system to GRBL (it currently uses a nasty parallel interface and Mach3 as the controlling software), we went and ordered - and have since fitted - a new spindle.


We took the old mounting plate off the z-axis - one of the nice things about the whole frame being bolted together is that it makes working on just parts of it really easy - and marked it out for drilling.


It was very tempting to just attack the aluminium with a household drill. But the new holes for the new spindle mount were going to be right on the very edges of the aluminium block, so it was important to make the holes truly vertical. This, of course, meant a trip to BuildBrighton hackspace to use the pillar drill.


The holes were drilled with a 5mm bit and tapped out with a 6mm tap-n-die set. From the photo it's clear how very near the edges of the block we had to drill.

An hour's bus ride later (after hopping on a wrong bus and ending up in the bowels of Brighton's suburbia) and the new CNC spindle was ready for fitting to the frame.


The controller board requires a dedicated 24V supply so we're going to have to cobble something together from an old PC supply if possible; otherwise it'll be another week or so delay while we wait for one of those to arrive too!


Sunday 6 March 2016

Learning multiple programming languages

Our (or rather, my personal) recent rant about Arduino attracted a few comments - some people mistaking our lack of enthusiasm for the Arduino platform as another PIC vs AVR argument (an argument with even less going for it now that Microchip has acquired Atmel). The problem with Arduino is not the AVR hardware (although it still isn't as robust as PIC) but the actual language. Sometimes it compiles incorrectly. Or maybe it's programmers (probably me included) not using the language correctly. Or not understanding how the language works.

My personal biggest gripe with Arduino is the reliance (and encouragement for developers to become reliant) on libraries - many of which are of variable quality; code submitted by programmers who have learned another programming language, rather than understood the device that they're writing code for.

There's a world of difference between learning a programming language and learning how to code. It's a question often seen on the interwebs - "which language should I learn?" Then there's a 500-comment discussion about why one language is superior to another.

Having recently got to grips with a Raspberry Pi after it sat in a box for a few years, and wanting to use simple face recognition in an art installation project, it's time for us to learn (yet) another language: Python.

Python is weird.

It's a weird language.
It uses white space and indenting as part of the language structure. It's horrible to work with. But it's just another language. The fact that you indent and "outdent" instead of using opening and closing curly brackets doesn't mean it's unworkable (it just means it's horrible to use!)

But that's only because most of the nerds at nerd club are used to working in languages that look different. Our criticism of Python is not really that different to some of the criticism we've attracted on this blog for posting PIC source code in BASIC. Hey, Basic's not C. We get it. It doesn't use exactly the same words as C, so you can't just copy and paste the code into your favourite C compiler, or put it into an Arduino IDE and expect it to work.

And Python's a bit like that.
I can't just use phrases like s = string.substr(2,4)
But that doesn't mean Python can't take a substring.
To do that in Python, you use s = string[2:6]
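A quick worked example of that equivalence (the string here is just for illustration):

# substr(2,4) means "four characters starting at index 2", which in
# Python slice notation is [2:6] (start at index 2, stop before index 6)
s = "Nerd Club"
print s[2:6]    # prints "rd C"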

There you go, same result, just different words to achieve it. And that's what makes a question like "which language should I learn" pretty meaningless. Especially now we've all got access to the interweb and Wikipedia at our fingertips. You don't need to know the exact syntax of a programming language to use it effectively. But you do need to know how to code to get the most out of it.

I probably know a dozen programming languages or more.
Some, I'm pretty proficient in. Some, I need to Google almost every other keyword. But that hasn't stopped me putting together a fairly comprehensive Python script in just a couple of hours - one that can capture photos from a camera, parse them looking for faces, and respond to commands sent over the serial port to adjust the brightness/contrast/shutter speed and so on.
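Just to give a flavour, here's a heavily stripped-down sketch of that kind of script (not the actual one - it assumes OpenCV with its stock haarcascade_frontalface_default.xml file, pyserial, a camera on /dev/video0 and the Pi's UART at /dev/ttyAMA0):

import cv2
import serial

camera = cv2.VideoCapture(0)
faces = cv2.CascadeClassifier("haarcascade_frontalface_default.xml")
port = serial.Serial("/dev/ttyAMA0", 9600, timeout=0)

brightness = 0                      # crude software brightness offset

while True:
    # check for a command over serial, e.g. "b40" = add a brightness offset of 40
    command = port.readline().strip()
    if command.startswith("b") and len(command) > 1:
        brightness = int(command[1:])

    ok, frame = camera.read()
    if not ok:
        continue

    # apply the brightness offset in software rather than poking at driver settings
    frame = cv2.convertScaleAbs(frame, alpha=1.0, beta=brightness)

    # look for faces and report how many were found back over serial
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    found = faces.detectMultiScale(grey, scaleFactor=1.3, minNeighbors=5)
    if len(found) > 0:
        port.write("faces:%d\n" % len(found))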

It's not because I learned Python.
It's because I know how to code.

And that's a very different thing altogether.
So don't worry about which language you're working in. To get better, just code more. Code lots. And lots and lots. And sometimes you can have a go at recreating one project in another language. In fact, that's probably a great idea. Just for a laugh. So when you've done something, you can feel confident porting it to another language, even if the language is not that familiar to you. That way you know you understand what's going on, rather than blindly copying words off a website to achieve a result.

I started out writing simple programs in BASIC on a VIC 20 as a small child in the mid 80s. Then I moved onto a ZX Spectrum. It was very similar, but not quite the same. Then, many years later, when I finally got my hands on a PC, it was full-colour graphics using QBasic. Some time after that, I was getting paid to program PLCs using ladder logic. And PCs in Visual Basic 3 (long before .vbx eventually became ActiveX). That led to coding in C, and later C++.

I've written games for the Gameboy in C and assembly and interfaced with banking databases in Cobol. I loved working in VB and ASP, but slowly had to give way to jquery, javascript and php. I've built entire engineering projects in BASIC on a variety of PIC microcontrollers, and rapid-prototyped designs in C++ in Arduino. I've built apps in java, J2ME, Objective-C and Actionscript. I loved using Flash's Actionscript to write simple online games and adding interactivity to websites, but have now moved over to C# on Unity (at least for games and app development).

Languages are not important.
It's what you're trying to say that matters.

All too often we train students (our next generation of engineers) to learn which words to use (and which keyphrases to find using Google) instead of how to actually write code - how to understand a problem, break it down, and solve each little step along the way.

But it's not always down to the failings of a tutor: too often people expect a complete copy-and-paste-friendly answer, instead of learning how to understand what it is they are trying to achieve. Rather than understand how to solve a problem, most people seem happy with "how do I make this problem go away".

One language is never really "better" than another. It's just something you get comfortable with. English, as a language, has a massive selection of words; it has subtlety and nuance not available in other languages. In French, I can ask my neighbours a question, and I'm always "demanding" - je demande. In English, I can ask, I can inquire, I can probe or query. But French also has words that we don't have and have since hijacked - for example, there's no real English word for the spooky feeling so beautifully described as deja vu.

If you grew up speaking French, you don't miss the words that have been taken away (the French language is regularly "cleansed" by removing duplicates and tautology - and a lot of anglicisms!). English, by comparison, is a jumbled mess of unnecessary words, rules and complexities. Which is "best"? Neither. Which are you more comfortable with? Probably the one you grew up learning.

Same goes with programming.
Don't waste time learning a language.
Learn how to code.
Learn how to understand and how to code well.
And the entire world could be better off for it ;-)

</rant>

Wednesday 2 March 2016

Adding a flip function to Adafruit's SSD1306 library for oLED displays

Just as we thought we were getting to the end of firmware development for our electronic guitar neck, Steve came up with an idea that we just couldn't shake off - namely, using one of those nice, neat oLED displays, in place of the clunky, power-hungry 16x2 character displays.

These little displays are not only super-small, but they light up without the need for a backlight - eliminating any potential contrast issues when you're up on stage, trying to select the appropriate display pattern!

These displays are a little more expensive than the 16x2 character displays (£2.80 each against about 99p) but the extra resolution, no need for fine contrast balancing and simple two-wire I2C interface made them altogether too irresistible not to use!


There are already a couple of display libraries available for these displays.
At first we tried the u8glib library, which worked quite nicely.
But as soon as we tried to incorporate this into the firmware already developed to control the rest of the guitar neck, we hit problems. In fact, it just plain refused to compile. It's just the frustrating nature of using clashing libraries with Arduino. Another reason to hate it.

But we needed to get our display up and running quickly and, if there's one thing Arduino has going in its favour, it's speed of development. So we thought we'd stick a separate, dedicated chip on the LCD and simply use it as a display driver: we pump serial data at it to get it to display text (and maybe even graphics) on the screen, and little else.

With this in mind, the Adafruit display library looked quite interesting. It uses a large buffer to hold the screen image data, then dumps it all to the screen in one hit - creating a fast refreshing effect, rather than a slow screen re-draw rate, as the data is read off external eeprom or similar.

In fact, it didn't take long (after changing the Adafruit libraries to use I2C address 0x3C instead of 0x3D as was hard-coded in places) to get our display up and running with the sample text provided.


Just as we were about to call it a night, someone asked "what about left-handed players"?
Well, I'm left-handed. I'm about as left-handed a person as I can imagine. In fact, I think I could probably lose my right hand completely and - other than holding multiple crisps or biscuits in it after being offered a packet - I don't think I'd particularly miss it.

Yet despite this, I learned to play a right-handed guitar.
Admittedly, this was because I couldn't - at the time I first started learning - afford a left-handed guitar, and so have suffered nearly a lifetime of being crap at playing guitar and continually dropping plectrums to the point where I don't bother with them any more. If a complete and utter, total leftie can learn to play a right-handed guitar, what's the point in left-handed guitars anyway?

Sadly, this argument didn't wash with anyone else; I was simply accused of being bitter, and told that was no excuse not to make a reversible/flippable LCD display. When even that didn't settle it, it was suggested that I didn't know how to do it and that it probably wasn't possible.

The code behind our "flip display" function isn't particularly fancy. And, on completing it and showing that it works, it was pointed out that the Adafruit library has a setRotation function which allows you to change the screen co-ordinate system before drawing to the display. But we built our flip function - and it allows you not just to flip the screen before drawing on it, but actually flips the buffer after it's been drawn to.

Because we're manipulating the shadow buffer, we can do all our clever image manipulation, then simply call the library's display() function to make it all appear on the screen at once.

It's important to understand how the library draws text/images to the screen. It starts in the top-left corner and draws 8 pixels down (a single byte value). Then it moves along to the next "column" of 8 pixels, and draws those.


So our byte values represent a "stack" of pixels from top to bottom.
If we were to take those byte values and reverse them (so, for example, 11100001 becomes 10000111) we'd effectively draw each column "up-side-down". Which is almost what we want. Except it's not quite.


We haven't actually rotated our image/text, merely flipped it. As well as changing the drawing order of each pixel in the image, we also need to change where it appears on the screen:


So the pixel at position zero (top-left) needs to end up at the last position (bottom-right).
If each column of eight bits is one byte value, we need the order of our bytes to be reversed, from 0,1,2,3,4,5,6,7 to 7,6,5,4,3,2,1,0.
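It's easy to convince yourself that "reverse the bits in each byte, then reverse the order of the bytes" really does amount to a 180-degree rotation with a few lines of Python (this is just a model of the idea, nothing to do with the Arduino library):

# model of the flip: bit-reverse every byte, then mirror the byte order
def reverse_bits(b):
    out = 0
    for _ in range(8):
        out = (out << 1) | (b & 1)
        b >>= 1
    return out

buffer = [0b11100001, 0b00000001]                       # two example "columns"
flipped = [reverse_bits(b) for b in reversed(buffer)]
print [bin(b) for b in flipped]                         # ['0b10000000', '0b10000111']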

This is exactly what we did with our new "flip" function. In the Adafruit_SSD1306.h file we added the single line:

void flip();


And in the Adafruit_SSD1306.cpp file, added the function:


void Adafruit_SSD1306::flip(void){
      // loop through every byte in the buffer and flip it vertically.
      // Since the display draws 8 vertical bits (per byte) we can just flip each byte
      // from MSB to LSB first
      uint8_t t;
      uint8_t p;
      int byteCount=((SSD1306_LCDWIDTH*SSD1306_LCDHEIGHT/8)/2)-1;
      int bufferEnd = (SSD1306_LCDWIDTH*SSD1306_LCDHEIGHT/8)-1;
      for (uint16_t i=0; i<=byteCount; i++) {
            p=0;
            t=buffer[i];
            for(int j=0; j<=7; j++){
                  p=p << 1;
                  if((t & 0x01)>0){ p=p|1;}
                  t=t >> 1;
            }
     
            t=buffer[bufferEnd-i];
            buffer[bufferEnd-i]=p;
     
            p=0;
            for(int j=0; j<=7; j++){
                  p=p << 1;
                  if((t & 0x01)>0){ p=p|1;}
                  t=t >> 1;
            }
            buffer[i]=p;
      }
}

And hey presto - call the flip function after drawing (and before calling display()) and the entire contents of the display - even after you've filled it with junk - appears rotated 180 degrees: