Wednesday, November 18, 2009

More about Piano Grande 15th Nov 2009

My review of Piano Grande!
As usual, the Government House Ballroom proved to have superb acoustics for chamber music.
While amplification gets used there - the people speaking at the lectern had it - it is rarely needed for the performances. The two Fazioli pianos rang clearly at both the piano and forte extremes.
Graeme Gilling and Mark Coughlan played a Percy Grainger arrangement of Bach (a Cantata extract). Despite being a familiar Bach tune it seemed more Grainger than Bach, almost expressionist in style.
Then came a Mozart Sonata for 2 pianos played by Yoon Sen Lee and Kathy Chow. I'm not much of a Mozart fan so this mostly just passed by for me. I was impressed by the calm confidence of both performers and Kathy's lightning flick of the arm to turn the pages of her score.
Next was a short piano duet from Poulenc - much lighter than I would expect of him. Played in a sprightly manner by Mark and Lyn Garland. Mark credited Lyn for suggesting the piece and observed that unearthing some rarely played pieces was one benefit of assembling a program to highlight the two Fazioli pianos.
And it was certainly a highlight that followed as Emily Green-Armytage and Adam Pinto explored with gusto Rachmaninov's Suite No 2 (Op. 17). In four movements, this was by turns energetic and sublime. As ever, part of the audience applauded by mistake after the "intro" movement, but by the end of the third ("romance") I would have welcomed a chance to reward such worthy playing. With no pause, though, Emily and Adam launched into the "tarantelle", which undulated fittingly all the way to its climax. An energised audience gave long, well-deserved applause.
During the break, many in the audience went up to get a closer look at the pianos. I don't listen to much piano music so my opinion doesn't carry much weight, but I did find the higher notes free of that uncomfortable xylophone-like tinniness I normally notice in the upper register.
After interval there was a speech from Barry Palmer covering the other purpose of the concert - the 40th birthday of local firm Zenith Music. I used to live in the suburb where they are based and can verify their long history of supporting music teaching. As a left-handed guitarist, I found Zenith a must-see when I moved to Perth in 1980, and my two neglected acoustic guitars were both bought there.
The second half proper began with four pianists on two pianos for a Smetana Rondo in C major. I remember that I enjoyed it but can now only recall the cosiness of so many hands sharing the keyboards. Little did I know what was yet to come.
Next was the seven-year-old Shuan Lee playing opposite his father Yoon Sen Lee for a grand sweep through themes from the Yellow River Concerto and more. In some ways this was reminiscent of the Grainger/Bach combination earlier on: a fluid mixture of distinctly Chinese phrasings within the swirling familiarity of that most Western instrument - the piano. Watching Shuan Lee bob about was an unexpected bonus but was the only childlike aspect of his performance. One to watch there.
The next item was listed as a Rachmaninov Romance for three performers. I presumed this would be two at one piano and one at the other. Not so! All three were crammed together at one keyboard. Emily in the middle had to play with almost straight arms as Graeme and Lyn were on either side of her. Somehow the piece was coherent and I gave up guessing which player had which lines. I'm glad for their sakes it was a short piece.
For the final piece Bo An Lu came out and reprised his Young Performers Awards feat of Tchaikovsky's Piano Concerto #1, with Mark Coughlan tackling the piano reduction of the orchestral sections. A tour de force as you'd expect and a perfect counterpart to the end of the first half. I have to say the piano reduction seemed troublesome in places - I'd not heard it before. Indeed, I'm fairly sure that was the first time I've heard a piano-reduction arrangement played opposite a solo piano.
All in all the whole concert was a superb exploration of two grand instruments and you certainly didn't need to be a "piano nut" - as Mark Coughlan put it - to be enthralled by magical works in a wonderful space played by great pianists on truly grande instruments. My thanks to all involved.

Tuesday, November 17, 2009

Hyper-realism and Continued Judgement by Probability

We're all familiar with judgement by probability. So familiar, in fact, that many of us don't realise the extent to which it is our default way of making decisions - and never ask whether using it so often is wise.
So what is it exactly?
An example would be: choosing a restaurant based on how popular it is. Our reasoning is that in the absence of first hand knowledge this is probably a good choice. I doubt that any of us think it likely that a bad restaurant could be popular.
Another example would be choosing a product in the middle price range figuring that is least likely to be overpriced or of poor quality. We are using an assessed probability to make a judgement.
We often hear advice that this is what we should do when we lack expert knowledge ourselves. I think it should surprise us that the same advice is often given as an indicator of expert strategy.
This raises the question: when should anyone not follow this rule? How often is it not the best choice?
Well, in many situations the concept rests on the idea that we, at that moment, are the only ones choosing in that way - i.e., that most other people are indicating a value they independently recognised, presumably from prior experience.
But this isn't always the case. In my city there is a certain restaurant which is busy and successful, yet it only gets bad reviews and everyone who goes swears never to go there again. It succeeds because it has one unmatchable feature and everyone goes once for that. My city is large enough that one visit per lifetime for all adults is still plenty of trade for the restaurant to work through for years to come.
Another anomaly is profiling, be it for fraud detection or customer potential. Here an analysis is done, say to find the most common income bracket for a product. Once chosen, that group will be further targeted - the rationale being that this gives the best probability for campaigning value.
[Aside: One fairly obvious place where this doesn't work is in a saturated market. There, it might be more productive to chase the customers unlike those you already have. For the purpose of discussion let's presume that the world in which we are choosing is large and our choice is only a small effect.]
Is choosing the most probable really a good strategy?
There are two different types of situation in this regard, which I will label:
- recursive popularity;
- range frequency.
In recursive popularity the problem is that sometimes the only reason a choice is more popular than the others is... because it was already more popular than the others. In the early stages there may not have been any compelling reason at all.
Therefore in choosing it now you are really only selecting the one with a lucky history.
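This "rich get richer" effect is easy to demonstrate with a small simulation (a hypothetical sketch of my own; the option count and numbers are invented for illustration, not drawn from any real data). Each newcomer picks an option with probability proportional to its current popularity:

```python
import random

def simulate_popularity(n_options=5, n_customers=10_000, seed=1):
    """Each newcomer chooses an option with probability proportional
    to its current popularity (a 'rich get richer' process)."""
    rng = random.Random(seed)
    counts = [1] * n_options  # every option starts out equally popular
    for _ in range(n_customers):
        pick = rng.choices(range(n_options), weights=counts)[0]
        counts[pick] += 1
    return counts

counts = simulate_popularity()
# The options are intrinsically identical, yet the final shares are
# typically very unequal: an early random leader keeps attracting choices.
print(sorted(counts, reverse=True))
```

The options differ in nothing except their luck in the first few rounds, yet the process reliably produces a runaway "most popular" choice - which is exactly the history a probability-based chooser then mistakes for evidence of quality.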
With range frequency, the characteristic is something which has a natural range of values. In people, an example would be height. [Aside: you might be surprised how often people make selections biased by height.] A classic example is using an economic metric - of a business, or a customer - to bias a selection. Imagine any such metric. Most likely it will have something akin to a normal distribution. There can be many reasons why the parts of the population away from the norm are abnormal in other respects too.
If we use height to predict clumsiness we might effectively just discover that it's easy to find tall clumsy people, because they hit their heads on doorways more than short clumsy people do. So if our way of detecting clumsiness is to look for forehead bruises then yes, we will probably find that preselecting tallish people is more productive than random selection.
[We may also find it better to concentrate on the averagely tall rather than just start from the tallest and work down. This might be because there are not many very tall people and they know well to duck their heads in doorways.]
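The forehead-bruise thought experiment can be put in numbers (all the rates and thresholds below are invented for illustration). Clumsiness is made independent of height, but the "detector" only fires for people who are both clumsy and tall:

```python
import random

def bruise_study(n=100_000, seed=0):
    """Clumsiness is independent of height, but our 'detector'
    (forehead bruises) only fires for clumsy AND tall people."""
    rng = random.Random(seed)
    population = []
    for _ in range(n):
        height = rng.gauss(170, 10)        # cm, roughly normal
        clumsy = rng.random() < 0.10       # 10%, independent of height
        bruised = clumsy and height > 185  # tall enough to hit doorways
        population.append((height, clumsy, bruised))

    tall = [p for p in population if p[0] > 185]
    tall_hit_rate = sum(p[2] for p in tall) / len(tall)
    overall_hit_rate = sum(p[2] for p in population) / n
    # But where are most of the clumsy people actually?
    clumsy_tall = sum(1 for h, c, b in population if c and h > 185)
    clumsy_all = sum(1 for h, c, b in population if c)
    return tall_hit_rate, overall_hit_rate, clumsy_tall / clumsy_all

tall_rate, overall_rate, share_tall = bruise_study()
print(f"detection rate among tall people:      {tall_rate:.3f}")
print(f"detection rate in the full population: {overall_rate:.4f}")
print(f"share of clumsy people who are tall:   {share_tall:.3f}")
```

Preselecting tall people looks far more "productive" per person screened, even though clumsiness has nothing to do with height here - and the great majority of clumsy people, being of average height, are never found at all.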
This idea of selecting away from the norm has been a catchy meme in some quarters, especially as it seems "targeted" yet conveniently selects a suitable number of potentials who are inherently a bit _different_.
The problem here is that reverse thinking is at work. Even if the subset is truly productive, the unasked question is: where are most of the population we're seeking? I suspect that many of them remain unnoticed amongst the throng nearer the norm.
These issues become even more pertinent where analysis is trying to uncover the results of human guile. The best place to hide a leaf is in the forest - a principle well known to those with something to hide.

Monday, November 16, 2009

Are IT folk really luddites?

While computing generally lends itself to automation I frequently get the feeling that those in charge of IT are not actually keen on it.
Now, by "those in charge" in this context I'm not pointing the finger at IT management (although I'm certainly not exempting them either). Rather, my focus is on all the people who get their hands dirty with IT day in, day out. When was the last time you heard anyone in IT say "ok, I've done what you asked and I've made a button you can push next time you want the same thing"? I'm guessing that by now, most people have NEVER heard that from IT.
Why is that? What went wrong?
I can think of several answers but I don't know which is right.
1. Some amount of control has passed from admin to user, which has had the dual effect of giving IT folk no reason to be involved and of their no longer being hired with those skills. To older code warriors this is hard to fathom - what sort of IT person cannot code?
2. Another explanation is the bureaucratisation and outsourcing of IT. If you're providing an IT service, it may seem not in your interest to let the customer stop needing you. That seems to apply as much to internal IT as it does to outsourced IT.
What I see as a paradox is that in many ways automation is tantalisingly closer than ever. So many tools are available - many of them free - and yet my impression is that the typical user is still as far away from using them as ever.
And do IT folk see it as their role to assist with this? Rarely.
If it weren't for the Web 2.0 revolution then we'd be going globally backwards. Things like Wordpress and Facebook mean there are millions of users getting on with content creation and sharing with almost zero contact with IT folk.
A good measuring stick is a workplace I know of where there is no use of Web 2.0, and little even of Web 1.0 systems. In that place, general IT skills have atrophied and it is quite normal to see people doing repeated "manual" tasks on their computers. There is no culture of, and little awareness of, automation. It is telling, for example, that there is no automation of email handling at the client level beyond simple filtering.
Maybe my perspective is warped, but it seems to me there is a new divide between the tech-knows and don't-knows. The division is not the old one of programmers versus non-programmers. Instead the split is between those who know things can always be better and those who only use whatever they've been shown how to use (and could follow).
It used to be that most people I knew in IT were actively interested in shifting from one group to the other. Now though, I'd say most IT people prefer users to stay dumb. They even inhibit the awareness and use of automation tools. They are thus the new luddites - breaking the machines lest they come into full operation.
From things I have seen this problem has become difficult to work around. Imagine you are a small business that needs an IT setup. Who can you find who will act in your interest rather than theirs? The ideal IT job makes itself obsolete and leaves the customer with little extra need for service. I'm not saying that there are no businesses out there doing that. What I notice though is that the really successful IT service businesses provide a generally poor service - and this may be precisely why they are successful.
In the industrial revolution, the luddites failed to win the war. For IT the outcome is yet unclear but it worries me that maybe the new luddites are winning.

Piano Grande at Govt House

Went to a brilliant concert at Govt House Ballroom - "Piano Grande" based on 2 Fazioli pianos and a pound of Perth pianists (yes I had to Google for that collective noun) playing pieces for pairs of performers plus permutations [ok, that's enough with the Ps :Ed]... umm, particularly pertaining to Poulenc?
I'm not that into pianos but this was by turns breathtaking, reflective and indulgent. My hat's off to Mark Coughlan and all involved for such an opportune event.