2014_02_12

Got my MSc dissertation mark today. Passed!

Code had 32 sets of ‘place markers’ – one for each LA, all basically identical except that each LA’s markers were added to a different cluster group. Nightmare to maintain – if I change one thing, I have to change it 31 more times (exactly the same change). Chances of mistakes very high.

So spent today hacking this into a routine. Not helped by wasting hours forgetting that ‘=’ sets a variable’s value, while ‘==’ tests whether two values are equal. GRR!
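
For the record, the trap in miniature – a sketch with made-up data, not the real code:

```javascript
// '=' assigns, '==' compares. Inside an if-condition, an accidental
// assignment overwrites the variable AND evaluates to the assigned value,
// so the branch runs for every CC regardless of its LA.
function countAberdeenCCs(ccs) {
  let count = 0;
  for (const cc of ccs) {
    if (cc.la == 'Aberdeen') {   // correct: comparison
      count++;
    }
  }
  return count;
}

function buggyCount(ccs) {
  let count = 0;
  for (const cc of ccs) {
    if (cc.la = 'Aberdeen') {    // bug: assignment – a non-empty string is always truthy!
      count++;
    }
  }
  return count;
}

const ccs = [{ la: 'Aberdeen' }, { la: 'Angus' }, { la: 'Fife' }];
console.log(countAberdeenCCs(ccs)); // 1
console.log(buggyCount(ccs));       // 3 – every CC 'matches'
```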

I now have a forest of markers over Scotland.

Next step is to bring back the clustering.
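
The shape of the refactor, roughly – a sketch rather than the real code, with the Leaflet API passed in (markerClusterGroup comes from the Leaflet.markercluster plugin) so the routine can also be exercised with a stub; ccData is an invented name for the per-LA data:

```javascript
// Build one cluster group per LA from the data, instead of 32 near-identical
// lumps of code. `leaflet` is the injected Leaflet API (or a stub for testing).
function buildClusters(laNames, ccData, leaflet) {
  return laNames.map(function (laName) {
    var cluster = leaflet.markerClusterGroup();        // one group per LA
    (ccData[laName] || []).forEach(function (cc) {     // ccData: { LA: [{name, lat, lng}, …] }
      cluster.addLayer(leaflet.marker([cc.lat, cc.lng]).bindPopup(cc.name));
    });
    return cluster;
  });
}

// On the page itself, roughly:
//   buildClusters(laNames, ccData, L).forEach(function (c) { map.addLayer(c); });
```

Now a change to the marker logic is one change, not 32.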

2014_02_11: to do list

Peter: CC location finder
Edit Bruce’s draft IIDI item, then tweet it IN PROGRESS RIGHT NOW
Sort permissions on cngn.co.uk (MAY NOT BE NECESSARY – the webserver bit works)
Bruce: CC location finder
Retweet PAC’s tweet
Get task trackers up to date. DO THIS EVERY DAY! DONE
Update MoSCoW list for CC location finder (include bounds to pan/scroll and search) DONE
Refactor code using array for list of LAs (not 32 repeated lumps of code)
Next most important task: limiting pan/scroll and search
White background for site
Find a way of batch-converting postcodes to LatLong
Test-run it on 2011 data
Bruce: CeDEM and CC website survey
Draft tasklist for CeDEM paper revisions. Concentrate on stuff that would help survey project
Bruce: funding
Implement PAC thoughts (does Aims bit include appropriate words, dewhiteman text)
Get Hazel’s advice on language
Get advice on costs from Nina Hakanpaa (N.Hakanpaa@napier.ac.uk)
Bruce: Commission on Strengthening Local Democracy
Respond: I will make it; PAC would like to come but has teaching commitments
Set date with Peter for discussion of what we might say

2014_02_10

Not hugely productive today. Some chat with a colleague about CCs in her LA being, er, less effective than she’d like.

Attempts to go back to unhacked versions of Leaflet etc., so I could properly document where I changed the locations of the image files they call.

Some work on a funding application – I’m useless at those!

Knocked together a list of all the things my supervisor and I could talk about at tomorrow’s management meeting.

Er, that’s it

2014_02_09

So I’ve given up on the polygons for now. In fact, I’ve received some advice in response to an online plea for LatLong shape files for Scottish LAs but I won’t be able to look into this until Tuesday afternoon.

So today on with the show of drawing a map with markers for all CCs for which I have LatLong data. This should have been easy – I’d had something very like this working for 2 CCs already, with different coloured markers for whether the CC had both website and email address, just an email address, no electronic contact details or no actual functionality.

I started by knocking up some css and a disclaimer/copyright page and some custom markers for functional and non-functional CCs – easy, if not the prettiest thing ever. Then I followed the recipe for displaying a couple of LAs’ CCs (without differentiating between functional and non-functional), with the markers for each LA coalescing when zooming out from the map.

All fine, until I noticed the numbers in the coalescences were not correct – there should have been 19 CCs in Aberdeen and 71 in Aberdeenshire, but while Aberdeen was correct, Aberdeenshire had 90 CCs, so the markers for Aberdeen were presumably being added to the Aberdeenshire set too.

I then added in Angus CCs, and got even more confusing numbers. I battled with this for several hours, even stripping the code right down to, for each CC in the whole dataset, document.write(“rude word”);. No joy.

So this is where my personal life affects my work life – I went to my regular Sunday evening spin class. I started again at about 8:30 and within 30 minutes I had it producing the correct numbers for 3 LAs’ CCs. After 29 fairly tedious additions to a case statement, and checks that the right numbers appeared, all 32 LAs’ CCs were appearing correctly.

A quick hack to one LA got me green markers for functional CCs and red markers for non-functional ones, with each LA’s own CCs coalescing into one lump no matter what colour they were.

Then half an hour’s web-trawling got each LA’s CCs’ markers displaying popups linking to the relevant LA web pages. Job done!

So entry to the page draws a map of Scotland, with all CCs for which we have LatLong data coalesced into lumps that show the number of CCs in each coalescence. Zooming in, or clicking on a coalescence, de-coalesces it so that markers for individual CCs display. A marker is green if the CC is functional and red if it isn’t. Each marker has a popup with the name of the CC and a link to the relevant LA’s webpages about its CCs. Entering a postcode or address in the search box zooms to that place, so you can see the CCs around it. (You may need to zoom out a little.) We will draw a discreet veil over the question of how good the LatLong data I’m currently working with is – not my problem!
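
The colour and popup logic boils down to a couple of tiny functions – a sketch with invented field names (functional, laUrl), not the real dataset’s:

```javascript
// Pick a marker colour from a CC's functionality flag.
function markerColour(cc) {
  return cc.functional ? 'green' : 'red';
}

// Build the popup: CC name plus a link to the LA's CC pages.
function popupHtml(cc) {
  return '<b>' + cc.name + '</b><br>' +
         '<a href="' + cc.laUrl + '">' + cc.la + ' CC pages</a>';
}

// With Leaflet, roughly (icons[] being a hypothetical map of coloured icons):
//   L.marker([cc.lat, cc.lng], { icon: icons[markerColour(cc)] })
//    .bindPopup(popupHtml(cc))
//    .addTo(laCluster);
```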

Still to do

  • display LA boundaries
  • get the search box to drop a marker
  • optionally, calculate the distance to the nearest CC
  • optionally, calculate the distance to the nearest functional CC
  • be able to switch on and off each LA’s markers, using leaflet.js’s layers functionality
  • optionally, calculate distances to the nearest (functional) CC in an LA if only one LA is switched on
  • write a hacker-guide
  • style links in <h1>s

Getting there!

2014_02_08

Just started processing the Shetland shape file. There are 549 polygons and 201,158 co-ordinates. Updated process:

  1. Prepare a Word text file called Shetland.js which contains only var shetland = [ ];
  2. In a copy of the file containing the data for Shetland, replace all ] ], [ [ with two <new line> in Word. Save.
  3. In the same file, replace all ], [ with <new line> in Word. Save.
  4. Get rid of dross at beginning and end of file. Save. You now have just co-ordinates, with a double <new line> between the set for each island.
  5. Find first double <new line>.
  6. Cut all co-ordinates preceding this. Save – you don’t want to risk doing this set twice.
  7. Paste the clipboard into the online batch converter, then convert.
  8. Copy the converted data into TextMate. Save.
  9. Do the first RegEx to get rid of the dross from the conversion step. Save.
  10. Do the second RegEx to replace all <new line>s with ], [, then save.
  11. Add , [ [ to the beginning of the Textmate file, and ] ] at the end. Save.
  12. Copy the whole of the TextMate file and paste it into Shetland.js just before ]; (For the first set, you don’t need the initial comma.) Save.
  13. Refresh the map webpage – another island will have been highlighted.
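
Steps 1–12 are all string-and-array munging, so in principle a short script could replace the Word dance. A sketch in JavaScript – makePolygonFile and its arguments are invented names, the input is assumed to be a list of polygons each holding [x, y] pairs (real GeoJSON MultiPolygons nest one level deeper, with rings inside each polygon), and convert() stands in for the NG-to-LatLong step that still needs the online converter or a projection library:

```javascript
// Build the 'var shetland = [ … ];' text the map code expects,
// from a list of polygons (each a list of [x, y] points).
// convert() maps one National Grid point to [lat, long]; the caller supplies
// it, since the conversion itself needs an external service or library.
function makePolygonFile(varName, polygons, convert) {
  var sets = polygons.map(function (polygon) {
    var coords = polygon.map(function (pt) {
      var ll = convert(pt);
      return '[' + ll[0] + ', ' + ll[1] + ']';
    });
    return '[ ' + coords.join(', ') + ' ]';
  });
  return 'var ' + varName + ' = [ ' + sets.join(', ') + ' ];';
}

console.log(makePolygonFile('shetland', [[[1, 2], [3, 4]], [[5, 6]]],
  function (pt) { return pt; }));
// → var shetland = [ [ [1, 2], [3, 4] ], [ [5, 6] ] ];
```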

2014_02_06 encore

I couldn’t leave it alone this evening – 5 hours after I got home and I’ve done the 6 LAs on Scotland’s mid-east coast. The wee bit that was missing from Aberdeen has been restored. (I processed that dataset again and the missing points magically reappeared.) So here it is in glorious technicolour:

Screen Shot 2014-02-07 at 04.46.30

The points are very close to the LA boundaries on OpenStreetMap tiles, which is quite encouraging. The dashed line is OpenStreetMap and the solid line is Leaflet’s rendering of the converted data.

Screen Shot 2014-02-07 at 04.46.12

Fife is a 4-part multipolygon – that is, there are four polygons in the data that represents Fife. I guess these are the mainland plus the Isle of May, Inchkeith and Inchcolm. Each part of a multipolygon requires separate processing via the hero of the hour(s), http://gridreferencefinder.com/batchConvert/batchConvert.php.

So I’m dreading doing Shetland, Orkney and the Western Isles. But perhaps they don’t need to be marked or coloured because they are clearly separate from the mainland.

Less heroic have been Mac TextEdit (no RegEx), TextMate (one successful RegEx, then it fell over and won’t stop trying to process a long-finished-with Angus file) and Word (it has done most of the heavy lifting but has fallen over at least 10 times). I’ll forgive Eclipse for chugging away at RegExes very slowly, but I won’t forgive it for completing the RegEx then hanging so I can’t save the result.

The optimal process seems to be

  1. Open the original data file in Word.
  2. Find any sub-polygons, by replacing  ] ], [ [ with multiple <new line>s.
  3. Save.
  4. Grep ], [ with ,<new line>
  5. Save.
  6. Get rid of the dross at the start and end of the file, so that I’m left with just sets of [number,number],<new line>[number,number], …[number,number],<new line>[number,number].
  7. Save.
  8. Manually copy and paste the first set into the heroic web page, and tell it to convert. If it moans, there is more than one set in what I’ve copied, so go back and sort it out. At this point, Word will fall over – hence the multiple saves.
  9. When a clean data set is in the hero, it will convert the data to lines containing some unwanted blurb and the data in LatLong format.
  10. Copy that bit to a new TextEdit file called <LA-name>.js, then leave some spaces.
  11. Convert the next data set, i.e. the next polygon in the multipolygon.
  12. Copy and paste that data into the TextEdit file.
  13. Repeat steps 8-12 until all polygons are in the TextEdit file. Save and close it.
  14. Open it in Eclipse. Do the RegEx to remove the unwanted blurb. (To play it safe, do a few hundred lines at a time, saving each time.)
  15. Save and close the file – Eclipse is about to fall over.
  16. Open the file in Word again, then replace all multiple new lines with a suitable swearword.
  17. Replace all new lines with ], [
  18. Replace the swearwords with ] ], [ [ to restore the boundaries between sub-polygons.
  19. Top and tail the file so that it has the format var <LAname>  = [ [ [number, number], … [number, number] ],  [ [number, number], … [number, number] ], …  [ [number, number], … [number, number] ] ]; 
  20. Save it in the right place in the website data structure, add the necessary call to index.html’s head and the necessary Leaflet call to drawMap.js.
  21. Open the web page and draw the map.
  22. Fix the colours in drawMap.js
  23. Move on to the next LA!

2014_02_06

  1. Got UK local authority data from https://www.sharegeo.ac.uk/handle/10672/305
  2. Converted the .shx file to .js using ogr2ogr -f "GeoJSON" newfilename.js sourcefilename.shx
  3. Opened the resulting 88MB file in Word – around 10,000 pages.
  4. Find first mention of Scotland – it’s about 7000 pages in. It’s to do with Angus.
  5. Find second mention of Scotland. This is in the code marking the start of the data for Clackmannanshire.
  6. Select all the data for Angus, then cut it out and paste it to a new document. Save that as Angus.txt.
  7. Repeat steps 4 to 6 to get data files for all 32 Scottish LAs.
  8. Such a shame they are in National Grid format, not lat/long.
  9. proj4leaflet claims to handle other projections. Trying it, but the huge size of the Clackmannanshire data file seems to be killing Eclipse. So I have abandoned that, going back to what we have so far and trying to convert NG co-ords to LatLong.
  10. http://gridreferencefinder.com/batchConvert/batchConvert.php batch converts but there’s a lot of Word/RegEx jiggery pokery to prepare the stuff for batch-conversion, then convert the results to array format.
  11. It almost works for Aberdeen, but a few data-points seem to be missing.

Screen Shot 2014-02-06 at 19.00.04

Screen Shot 2014-02-06 at 18.57.41

  12. (Ignore the blue line and shading in the first screenshot – it’s part of a very rough outline of the UK, from when I was trying to get to grips with geoJSON use in Leaflet.)

Screen Shot 2014-02-06 at 19.01.54
  13. The conversion step is choking my mac, so time to ask the internet if anyone has LatLong shapefiles for the Scottish LAs.
  14. Here endeth today’s lesson.
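
Steps 4 to 7 – the manual hunt through a 10,000-page Word document – could also be scripted if the big file parses as JSON. A sketch: featuresForLA is an invented helper, and the 'NAME' property is a guess I haven’t checked against the ShareGeo data:

```javascript
// Pull one LA's features out of a big GeoJSON FeatureCollection,
// so each Scottish LA can be saved to its own file.
function featuresForLA(featureCollection, laName, nameProperty) {
  return featureCollection.features.filter(function (f) {
    return f.properties[nameProperty] === laName;
  });
}

// e.g. (property name 'NAME' is an assumption):
//   var angus = featuresForLA(bigGeoJson, 'Angus', 'NAME');
//   require('fs').writeFileSync('Angus.json', JSON.stringify(angus));
```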

2014_02_05 Progress

I can’t remember much of last week. My diary tells me I was occupied every weekday evening. Monday, Wednesday and Thursday evenings were at gyms, with a brief visit to a social media surgery before Thursday evening’s gym session. Tuesday and Friday evenings were spent helping a community council set up its new WordPress blog.

I know I spent Thursday and Friday creating management tools for the two community council projects. This also took up some of Monday (3rd). Tuesday saw a very brief planning meeting, then I jumped on a train to Glasgow to meet the ever-delightful Heather Burns, then go to Ofcom’s Digital Participation round table, then come back from Glasgow to help the CC with its website again. (There isn’t any real problem with it – it’s a standard WordPress.com website but linked to a .co.uk domain. I haven’t yet set up email addresses using the domain – I suspect that might be a headache.)

Today I’ve sent data to the visualisation client so they can verify it. (The data is from 2011-12 and so is out of date – the client’s contacts should clean it up so the project’s website accurately portrays where things are.)

The latest version of the code shows I can display a polygon based on a geoJSON file (tutorial).

Screen Shot 2014-02-05 at 18.22.07

I also know I can obtain open shapefiles for LAs from the Ordnance Survey. The intervening step is to convert shapefiles to geoJSON. Supposedly GDAL does that but I’ve not yet persuaded my mac to run it. Following this tutorial may help – but that’s for tomorrow!
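
Noting for future me: ogr2ogr can reproject at the same time as it converts, which would sidestep the whole batch-conversion saga. Untested on this data – the filenames are placeholders, and it assumes the shapefiles use the British National Grid (EPSG:27700):

```shell
# Convert a shapefile to GeoJSON and reproject from British National Grid
# (EPSG:27700) to WGS84 lat/long (EPSG:4326) in one pass.
ogr2ogr -f GeoJSON -s_srs EPSG:27700 -t_srs EPSG:4326 newfilename.js sourcefilename.shp
```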

Update

This tutorial shows how to actually use geoJSON files!

Monkey magic

Code Monkey

As of Friday morning, there is a verbal agreement between a client and Edinburgh Napier University that the university will create some code for the client. And I’m to be the main code-monkey! I’ll even be paid. Not much, which is fitting because I am a very new code-monkey and this will be my first ever paid coding project, and because the project is really quite small – just proof that the concept can work and that Napier can do things the client wants. Nevertheless I’m very pleased.

Domain devilry

It’s also pleasing that 1&1, the providers of the domain to which I’ve mapped this blog, have finally fixed the fault which was preventing this domain mapping. I bought the domain on 12 January and it’s taken until today to get the issue sorted. There’s also an email address (bruce AT bruceryan.info) but it’s simply a forward to my main personal address.

Community Council news 1

I’ve had another meeting with the community councillor who wanted to set up a website for his CC. There appeared to be two copies of http://<name of CC>cc.wordpress.com and one copy of http://<name of CC>.wordpress.com. Each of these had bits of the most up-to-date material but none had all of it. So we copied the most recent content to separate text files on his laptop, deleted the errant blogs from WordPress, started a new blog at http://<variant of name of CC>.wordpress.com, reset the theme, remade the pages, tags and categories, then remade the posts. Setting up widgets went quite quickly. He was very pleased to see that WordPress blogs can include Twitter widgets. While his CC doesn’t yet have a Twitter account, he has a personal Twitter account and a personal blog. Adding the widget for the former to the latter took about 2 minutes.

He now has to take his efforts back to his CC for their approval, comments etc. We’ll then implement any desirable changes, and set up and train a co-editor. Then I’ll keep a watching/mentoring brief as the site develops. I’ve also interviewed the community councillor about his aspirations for the website, his level of ability and similar. In a few months’ time, I’ll interview him about what actually materialised – and that should be the basis of a research paper. It will also be of interest to others who need proof that the CC system is worthy of further investment in training.

Community Council news 2

I’ve been asked to take minutes for another Edinburgh CC. It will pay a small amount but will knock out one Wednesday spinning session a month. That’s not a problem – I could replace it with a spin on Tuesday or Thursday evenings. This CC also wants a website similar to the one I help run for Leith Central CC, so that should be easy enough to start off.

Research news

On Tuesday I’ll meet with my supervisor to plan some further research into community council websites. This will be an update of research we did in summer 2012. We’ve won a small grant for this research, enough to pay me for about 8 weeks. This should lead to a couple of research papers, I believe.

Physical space

My old G4/800 now has a new home way out west. It’s good to keep old silicon running.

Not yet started

  • a good practice guide for community council websites
  • writing papers based on my MSc dissertation
  • a serious attempt at setting up a domestic server

Watch this space.

Unicodswallop

One of my friends has duties that include setting up electronic billing for her employer. She recently told me

I am creating a new payment file (as European standards are changing) for the equivalent of BACS, and am somewhat upset to discover they won’t accept supplier names with accents in – which for a European bank, strikes me as remarkably inconsiderate.

I enquired further and she replied

The reply I got back from the bank included the line ‘In characterset UTF-8 special characters like Ö, & , é… are not declared.‘ See page 7 of  http://www.febelfin.be/sites/default/files/vademecum/Standard-XML-SDD-Initiation-v20b-EN.pdf.

While that page does indeed restrict the allowed characters to an unaccented Latin subset, the bank’s explanation gets UTF-8 exactly backwards: UTF-8 certainly does include accented characters. It’s a way of representing every character in Unicode!
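
A quick sanity check, in JavaScript for no particular reason:

```javascript
// UTF-8 happily encodes 'Ö' and 'é' – each becomes a two-byte sequence.
const bytes = Array.from(new TextEncoder().encode('Öé'));
console.log(bytes); // [195, 150, 195, 169] – 0xC3 0x96 for Ö, 0xC3 0xA9 for é
```

Whatever the bank’s file format forbids, it isn’t UTF-8 doing the forbidding.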