Thursday, 30 July 2009

The Post-War Consensuses in Britain

Whilst I might have lost some of my short stories in the shift between computers over the past few years, I am often surprised by what other things I stumble across, untouched for many years, when looking in old files. This seems to be an essay I produced in about 2001. Since then I imagine many people have written on the issues it raises, but interestingly some of it seems to be relevant again as we stand on the cusp of a general election. As I note, with reflection, it seems possible that the next government may try to shift things back to the policies of the Baldwinian consensus of the 1930s, and it is interesting to look at the current policies of David Cameron's Conservative Party in this context. So, this essay is dated, but I think some of the themes remain relevant and, in any case, interesting.


The Post-war Consensuses

The word consensus is often used in particular to describe the British political scene after 1945, when it is argued that the assumptions of both the Conservative and Labour parties were the same on many issues. As Dennis Kavanagh and Peter Morris state in ‘Consensus Politics From Attlee To Major’ (1994), consensus refers to “a set of ideas and conventions about the nature of the scope of political - and particularly governmental - activity. ... in other words, a set of governing assumptions and expectations.”

Britain seems to have gone through phases in its political history when there have been such assumptions unchallenged by the two main parties. More broadly it is argued that consensus is always apparent in the British political system, as there are few challenges to the way the system of civil society works or even to the basic institutions in Britain. It is rare that any revolutionary alternatives from the left or right have come forward. Part of this continuity has been provided by civil servants, especially those who rose from junior positions in wartime to senior ones by the 1970s. In addition, there are long-term assumptions fostered by Treasury culture which any minister finds hard to resist, for example, defence of the pound and a need to restrain expenditure. Kavanagh and Morris point to the perpetuation of the grammar school system certainly to the 1970s and even today, and to restrictions on immigration introduced by both Labour and Conservative governments, as showing a commonality of assumptions by the two parties. Other examples would be the enduring membership of NATO, the continuing close relationship with the USA, ambivalence towards the European communities and a desire to maintain nuclear weapons. Whilst other policies have been voiced, especially when in opposition, once in power both the main parties have adhered to these established approaches.

The periods of the clearest consensus have, however, been occasionally interspersed with much sharper adversarial politics, most notably between 1979 and 1994, when a new consensus was being established. It could be argued that a consensus only really comes out of upheaval and solidifies when the party in opposition realises it stands no chance of being elected unless it adopts the policies of the other main party. With this in mind we will look first at the key phases of consensus in the 20th century.

1) Consensus in the 20th Century
In the 20th century it can be argued that Britain saw three consensuses. The first, in the 1930s, extending from the ideas of Stanley Baldwin (Prime Minister 1923-4, 1924-9 and 1935-7), argued for a laissez-faire approach, without large-scale governmental intervention in the economy even in times of depression. It was summed up by reference to deflation and an attempt to maintain the role of sterling as a world currency and yet work within the closed bloc of the Sterling Area. [Whilst in 2001 I would not have imagined this consensus returning, in 2009 it seems very much like the approach that the Conservative Party under David Cameron favours.]

1a) The Attleean Consensus
The best-known consensus came in 1945. The concept of the post-war consensus was effectively developed by Paul Addison in his 1975 book, ‘The Road to 1945’. Kavanagh and Morris define the consensus as based on two elements. The first was agreement on the style of government, involving institutionalised consultation between the government and those important in the economy, i.e. employers and unions, something which developed during the Second World War. Keith Middlemas in ‘Power, Competition and the State, Vol. 1: Britain in Search of Balance, 1940-61’ (1986) calls this the ‘continuous contract’. The second was the range of policies which persisted from the late 1940s to the mid-1970s in handling social and economic needs, broadly termed the Welfare State and Keynesianism.

To some extent, consensus after 1945 can be seen as reflecting the new political reality of the post-war world. In the inter-war period, though the Liberals faded away, Labour had not yet really staked its place firmly in the British political structure, and in 1931 the Labour leader Ramsay MacDonald and a small number of Labour MPs joined the National Government coalition alongside Conservatives and Liberals. In this inter-war period, then, the consensus was still the one between Conservatives and Liberals as the main parties. Post-1945, Labour was the clear rival to the Conservatives and thus the consensus was between these two, slightly further apart.

The unexpected victory of the Labour Party in 1945, committed to nationalisation of the ‘commanding heights’ of British industry and combined with a liberal attitude to developing a welfare state, reflected a shift in popular opinion towards favouring a more interventionist approach by government. This stemmed from the failure of laissez-faire economics in Britain in the 1930s, in contrast to the USA’s New Deal, and from the evidence of government intervention in the Second World War, which was seen as effective in ensuring sufficient war production. Labour’s objectives were generally realised in its term in power 1945-51 - 20% of industry was nationalised and a welfare state embracing the NHS, social security and education was created. Importantly, in terms of the widespread support these policies received, aside from nationalisation, many of the ideas had come from two Liberals - William Beveridge and John Maynard Keynes.

William Beveridge’s report during the Second World War laid the groundwork for the welfare state. It was Liberal rather than Socialist in design, building on the reforms by the Liberal governments of the pre-First World War period. It relied on full employment but also on the fact that people would be happy to contribute to insure themselves against future needs such as ill-health, unemployment and old age. The fact that the system was funded through national insurance contributions, rather than taxes as would be the case in a Socialist system, reflected this. In addition, it was the Labour government which introduced charges on dentures and spectacles in 1951 as a way to better fund the system.

John Maynard Keynes was another Liberal, an economist. His writings in the inter-war years had aimed to reduce the persistently high levels of unemployment. He saw this problem as caused by insufficient demand in the economy. Simply put, his argument was that the government should stimulate demand in the economy through manipulating the interest rate system to encourage investment. Though he saw a role for direct government intervention, he certainly did not back large-scale nationalisation or direction of the economy in peacetime. Keynes’s ideas were first adopted in the 1941 budget, to act against excess demand rather than a lack of it. Despite a brief flirtation with economic planning after 1945, by 1948 Labour was committed to using Keynesian methods to manipulate the economy. The commitment to full employment inherent in Keynesianism was a key part of the Attleean consensus until it became untenable in the 1970s.

Added to these two key elements was a willingness to work with trade unions. The tripartite wartime approach of government working with both employers and unions was followed throughout the post-war years and received a boost from the Conservative government of Harold Macmillan in 1961 with the creation of the NEDC (National Economic Development Council), which brought together employers, unions and government in an attempt to effectively plan the economy and promote growth. This was an approach built on, though not successfully, when Labour came back to power in 1964. This really continued until the strikes of the early 1970s and the election in 1979 of Margaret Thatcher, a strong opponent of unions.

A further element of the post-war consensus was foreign policy. The first post-war Foreign Secretary, Ernest Bevin, previously a leading trade unionist and an ardent Labour member, was in fact conservative and almost Conservative in his outlook. He was a committed anti-Communist and welcomed assistance from the USA. He favoured the development of NATO to secure US involvement rather than the creation of a European Army. In addition, the 1945-51 Labour governments, like all British governments subsequently, continued developing or buying nuclear weapons. This attitude was only challenged by Labour in the early 1980s when in their most radical phase. All of these policies, adopted 1945-51, were continued by the Conservatives when in office.

The relationship with the European communities has always proved a more complex issue, but there were supporters and opponents of European integration in both leading parties, and Conservative prime ministers Harold Macmillan and Edward Heath applied to join the EEC just as Labour prime minister Harold Wilson did. There was also an ongoing consensus on the need to peacefully disengage from empire, despite the discomfort of some of those in the Conservative Party.

Though Conservative support for the use of Keynesianism and the absence of any attempt to scrap the Welfare State in the period before 1975 are often taken as showing that the consensus had shifted the political ‘centre’ towards the ‘left’, there are a number of factors to recognise. As mentioned above, the basis of the key consensus policies had been developed by Liberals rather than Socialists. Winston Churchill, prime minister 1940-5 and 1951-5, had himself been a Liberal 1906-22 and involved in a reforming government that had laid the foundations of the later welfare state. Thus, there was less distance for Conservatives to go to accept these centrist concepts than if they had been truly Socialist. Indeed, Churchill and the Conservatives stood more firmly against policies they saw as Socialist, and opposed economic controls and some elements of nationalisation, for example steel and road transport, which they denationalised when they returned to power. It was seen as acceptable to Conservatives on efficiency grounds, however, that other elements of the UK economy such as coal mining and the railways were nationalised. State control had been adopted right across the economy in wartime and was still seen as having been pretty efficient long after the war had ended; the final wartime economic controls were not scrapped until 1956, five years into the Conservative period in office. The Conservatives accepted, if not celebrated, the mixed economy which was in part state-run but still predominantly capitalist. This was summed up in the phrase ‘Butskellism’, used to characterise the economic policy of the 1950s, combining the names of the Conservative Chancellor after 1951, R.A. Butler, with the Labour Chancellor 1950-1, Hugh Gaitskell.

Being out of power for 13 years, 1951-64, encouraged the Labour Party to reflect on its policies. Though not envisaging a large shift of the kind carried out in the 1990s, there were attempts at adjustment in its outlook, especially to address the growing prosperity of 1950s and early 1960s Britain. In 1956, the influential Labour thinker Anthony Crosland questioned whether the commitment to state ownership of industry, which Labour had held since 1918, was relevant in the era of prosperity which Britain was then experiencing. In 1960, the Labour leader Hugh Gaitskell tried to scrap this commitment by abolishing Clause 4 of the party’s constitution. He was defeated and, following Gaitskell’s death in 1963 and his replacement by Harold Wilson, who favoured a more Socialist and technocratic approach, this brief phase of rethinking for Labour came to an end, until it was revived in the 1990s to face a new consensus. In addition, ironically, with British competitiveness seeming to flag, by the 1960s the Conservatives were moving towards planning the economy. To some degree this was influenced by Harold Macmillan (Prime Minister 1957-63), who in the 1930s, like many across the British political spectrum, had seen some degree of economic planning as the only way to lift the country out of the Depression.

1b) The Myth of Consensus?
In the mid-1990s, most notably from Harriet Jones and Michael Kandiah eds., ‘The Myth of Consensus: New Views on British History, 1945-64’ (1996), there was a challenge to the accepted views of the post-war consensus. Partly they and other historians argued that the Cold War, as much as the Second World War, shaped the post-war policies, for example in encouraging co-operation with unions and a mixed economy. In addition, they feel that factors such as the speed with which Britain decolonised prevented greater diversity of opinion from arising.

Historians such as Kandiah and Nick Ellison, in ‘Egalitarian Thought and Labour Politics: Retreating Visions’ (1994), show that within the Conservative and Labour parties there was a wider range of opinions on a whole range of issues than is often discussed. For example, they point to the 1958 resignation of the Conservative Chancellor of the Exchequer, Peter Thorneycroft, who sought a more monetarist approach as opposed to Macmillan, the prime minister, who much more favoured an economic planning, even interventionist, approach. Similarly, within the Labour Party there were always those who favoured nuclear disarmament and aimed to push on with nationalisation to embrace a far larger section of industry. This is taken to suggest that support for Butskellism was in no way comprehensive even within each of the parties, let alone between them.

Other historians such as Jones and John Ramsden in ‘An Appetite for Power: A New History of the Conservative Party’ (1999) show that the Conservatives had a different view of the welfare state. They favoured private rather than council construction of houses; targeted rather than universal benefits; and private rather than state-backed earnings-related pensions, all policies which were the reverse of the Labour approach. Ben Pimlott, in ‘Harold Wilson’ (1992), claims that by the late 1960s and certainly in the 1970s, there were differences between the two main parties even on fundamental issues. Andrew Gamble and S.A. Walkland in ‘The British Party System and Economic Policy 1945-1983’ (1984) in fact portray the 1970s as a period in which the adversarial nature of politics was so harsh that it was disruptive to the economy. Overall, one must be careful not to take the concept of consensus as applying to all policies or as being unchanging over time. That is too simplistic and, whilst consensus remains a useful tool, it should not simply be taken unchallenged.

2) The Thatcherite Consensus
It seems difficult to see how the Labour Party could ever approach Margaret Thatcher’s policies, which emphasised monetarism. Her approach to the economy, and by extension society, shook the assumptions of the Conservative Party itself. The shift away from the old-fashioned paternalistic One-Nation Conservatism came in a number of steps. First was Edward Heath’s 1970 victory based on the so-called ‘Selsdon Programme’. This aimed to break with the Keynesian approach to the economy, or rather what was perceived as the Keynesian approach, since in post-war popular usage the term had been stretched to cover far more state intervention than Keynes would have approved of. The programme aimed to reduce state involvement in the economy, to privatise industries, curb union power, free market forces and encourage greater industrial efficiency.

This approach was short-lived and by 1972 had been abandoned in favour of greater state intervention to bail out failing industries. For the first time in the British economy, high inflation was combined with rising unemployment. This was partly sparked by the end of the post-war boom, provoked by the quadrupling of oil prices in 1973-4. Cheap fuel had enabled the boom to continue from around 1950 onwards. The problems worsened for Labour when they returned to power in 1974. To cover its debts the government had to borrow from the IMF (International Monetary Fund), which forced deflation on the British economy.

In 1976 the prime minister James Callaghan abandoned the commitment to full employment which both parties had backed since the war. He also said ‘[w]e cannot now, if we ever could, spend our way out of a recession.’ This effectively marked the abandonment of the core element of Keynesianism, that the state should stimulate demand through expenditure. In addition, this step paved the way for the harsh monetarist approach adopted when Margaret Thatcher came to power in 1979. Jim Tomlinson has argued in ‘Public Policy and the Economy since 1900’ (1990) that the economy of the later 1970s shows that Keynesianism does not work. Effectively it was not tested during the post-war boom, when it was not needed to counter unemployment. When it was needed, in the 1970s, to tackle that problem, it failed. However, it must be recognised that there were other factors, such as external inflationary pressures, that prevented a Keynesian solution.

Thatcher was won over to monetarism in 1975 and through it gave a new, apparently more dynamic direction to the Conservatives. Once in office 1979-90, for Thatcher monetarism was just the foundation of a broader economic and social policy. This involved ‘rolling back’ the state by privatising almost all of the nationalised sector, combined with continued tax cuts which proved popular with the public and were seen as stimulating the economy. The approach, however, also included centralising powers away from local authorities, imposing expenditure cuts and in theory giving greater choice in terms of welfare services, though this was all achieved through greater central state intervention.

Kavanagh in ‘The Reordering of British Politics: Politics After Thatcher’ (1997) sums up the essence of Thatcherism, which we can see forms the foundation of the Thatcherite consensus. Of course, how these ideas were translated into policy was not always accepted even across her own government or party. However, as with all enduring attitudes, it is their presence over time and, thus, their becoming assumptions that people no longer challenge, which forms the foundations for a consensus. The assumptions were that delivery of services comes better from the private rather than the public sector, and that government can do little good but can do great harm, so its involvement and expenditure must be restrained to allow free enterprise to boom. Thatcherism emphasised self-help but also responsibility for one’s actions and thus taking the blame for one’s own situation, rather than expecting the state to alleviate it. Of course, the state was seen as having an important role in both defence and law and order; expenditure on these was not to be curtailed. In foreign policy Britain had to establish its continued importance through military intervention, nuclear weapons and its close relationship with the USA. Though the harsh social attitudes moderated somewhat under John Major (Prime Minister 1990-7), the foreign policy line is still visible today.

Though the years of Thatcher in power saw some of the greatest antagonism between the two main political parties, by the mid-1990s things were changing. Neil Kinnock, Labour leader after 1983, had proven successful in purging the extreme left from the Labour Party, but the fourth defeat in a row in the 1992 general election indicated that if they wanted to return to office, Labour had to do more than simply moderate their own party. Tony Blair became leader following the sudden death of John Smith in 1994. He had a clear objective of moving Labour towards what, after 15 years of Conservative rule, was becoming established as the new consensus. This approach was embodied in New Labour. Unlike Gaitskell, Tony Blair was able in 1995 to scrap Clause 4 and the Labour Party’s commitment to nationalisation.

Once in power Labour showed how far it had adopted the Thatcherite consensus. It kept to the Conservative expenditure plans bequeathed by John Major. It went further and made the Bank of England independent, so giving up an essential element of Keynesianism: the ability of government to manipulate interest rates. The changes to the welfare state, particularly in education and health, brought in under Thatcher were only slowly altered or not reversed at all. The government aimed to privatise the London Underground and air traffic control and repeatedly discussed privatising the Post Office. The only real break from Thatcherism has been devolution to Scotland, Wales and London, and it is clear that in these locations the Thatcherite consensus is less popular than it is centrally. The fact that since 1997 the Conservatives have struggled to develop distinct policies shows how much Labour has captured the consensus ground of Thatcherism for itself.

3) Conclusions
In ‘The Civic Culture’ (1963) and ‘The Civic Culture Revisited’ (1980) two US political scientists, Gabriel Almond and Sidney Verba, argued that British society was stable and did not face violent political challenge because of three trends: continuity, deference and consensus. Consensus in Britain is seen to embrace not only individual policies but also assumptions about the political system as a whole. It is difficult to argue that British institutions have been challenged to any great extent and thus it can be stated convincingly that there is a consensus on how the British political system runs and what issues it should address. However, as this essay has shown, the consensus on policy is more complex.

It does appear that assumptions can penetrate the British political scene which then make up the thinking of both the main parties. However, it is wrong to assume that consensus is a blanket attitude. Within it there is room for diversity between and within the political parties. In addition, at certain periods, especially the 1970s and 1980s, the differences between the main parties have been very sharp, and have encouraged the reverse view that Britain was plagued by the severity of its adversarial politics. Thus, consensus remains a valid concept for interpretation of British politics and society in the 20th century, but one which must be used sensitively and with care.

Tuesday, 28 July 2009

Interviews: A Real Song And Dance

It seems apparent that with rising unemployment, companies are looking for new ways to sift out candidates for their vacancies. Having been applying actively for jobs since February of this year, I have had seven interviews and have just been invited to my eighth, so the fashions in interviewing are of concern to me. However much employers deny it, there are fashions in the style of CVs and how interviews are done. Often the interviewers themselves cannot articulate what the fashion entails, just that they, and even more often the human resources staff who advise them, 'know' that there is a 'right' and a 'wrong' way, which may be very different from what was prevailing two years earlier. Of course, as I have outlined here, in many companies, no-one knows how the human resources department recruits people and interview panels often show clear surprise at the nature of the application forms they are given to consider.

I have never been a fan of television programmes in which people are voted off. However, they straddle so much of popular culture these days ranging from those dealing with the public such as 'The X Factor' and 'Britain's Got Talent', through the fate of 'celebrities' in 'Strictly Come Dancing' and 'I'm A Celebrity Get Me Out Of Here' to less entertainment orientated ones, notably 'The Apprentice' and 'Dragons' Den' in which individuals pitch themselves or their products in front of a panel of 'experts' to receive their approval or dismissal. Even in the programmes with a popular vote, the panel often leads the way. Of course, there have been game shows going back to 'Fifteen to One' in which contestants selected who among them would be asked the next question, evolving into more aggressive and deceptive behaviour for programmes like 'Golden Balls' and the short-lived 'Shafted'. Of course, the one that has been most enduring has been 'The Weakest Link' in which contestants get to vote off others usually on a tactical basis to make their path to the end easiest. The dismissal is delivered as cruelly as possible by an intentionally austere presenter. People, this is a game show not real life. Recruitment should not be debased this way; I am not an actor or musician.

As I say, I do not enjoy these programmes and, bar a couple of episodes of 'The Weakest Link' and 'Golden Balls', have never watched any of them. However, I am aware of them and how the tactics operate as well as any ardent fan. They have so penetrated British society that knowledge of the methods and the tactics is commonplace. To my alarm, this now seems to be creeping into the corporate world and shaping how recruitment is run. For my next interview I have to appear before a panel of 15, some of whom will not be knowledgeable about the topic I have to speak about. I have 10 minutes to present on a detailed topic and then there are 5 minutes of questions. The use of presentations before an interview is very common these days; however, in every other case, they are used as a springboard for questions in the interview, not as a method to sift out candidates without really getting to know them. I suppose I would have been happier if they had said they would let us know if we were through to Stage 2 an hour later, but they emphasised the immediacy of the decision.

It is interesting to note that this is my second interview this year for which I have been told there will be no data projector for Powerpoint slides nor an overhead projector. This is a vast change from the early 2000s, when I was rejected immediately at an interview because I did not use Powerpoint slides in my presentation. Partly it comes from ignorance of how this technology works. One interview I attended last month was delayed an hour because no-one on the panel could operate the projector equipment or laptop to deliver Powerpoint slides. Partly it is because it is now seen as hip and bullish to put aside the 'crutch' of Powerpoint slides. Ironically, that puts people like me, who were in the business when there was one terminal in an office, in an advantageous position over applicants who have been doing multi-media presentations since primary school.

What is jarring about this upcoming interview is that after my 15 minutes in the spotlight I will be told if I have got through to the next stage, i.e. the interview. Quite easily I could have driven 240Km and stayed overnight simply to have 15 minutes of performance and then come home. Of course, I cannot travel by train as I have no idea whether I will have to leave sometime in the middle of the morning or hang around for an interview at 5pm. The interview schedule will only be outlined after the performance stage. This clashes with the expectations of most companies that you book train tickets well in advance and at the cheapest price, and in the UK today, that means sticking to a set scheduled train. I have no idea what the interviewers can learn in 15 minutes, especially if the bulk of the audience are not knowledgeable about the areas the job covers. I can understand you have to make a good show to customers and prospective clients, but this system seems to be more beauty contest than even talent show. I can envisage the audience holding up scoring cards or simply a tick or a cross to say whether you go through to the next stage.

Interviews, as I have noted before, are getting shorter. They are about 60% of the duration they were when I was looking for work in the early 2000s, but the specifications are increasingly complex and often straddle many different roles. Perhaps I am too old to respond to the type of interviewing which demands that you snatch the attention of the 'audience' in an instant and somehow communicate vast areas of knowledge in a handful of minutes. If I was not so desperate for work, I would consider turning down this call for interview. I am not keen to encourage this methodology to spread, however often it is used in popular culture. I think it will do a disservice to the companies who use it, and I wonder how many companies will still be employing any person they recruited this way 12 months down the line. Recruitment is a slow process and needs to be thorough. Adopting flashy approaches will not secure you the ideal candidate and it simply encourages behaviour and methods that take us even further away from the type of skills necessary for most business. I accept that for sales and even marketing it might be appropriate, but who ever insisted that the stock controller or the quality manager had to be able to sing karaoke well to work effectively in those posts?


P.P. 29/07/2009: I was persuaded by friends to withdraw from the process. As they pointed out I would be travelling 480Km with a good chance that I would not even be interviewed. I would have had to prepare for Stage 2 even if I did not reach it. Even if I got through to Stage 2 there was no indication at this stage of the schedule, making booking train tickets impossible. I wrote a strongly worded email to the company saying I was not going to go through their version of 'The X Factor' and will spend my time applying for jobs with companies who do not indulge in this kind of false machismo. I have reached a certain level in my career and expect some degree of respect, but I hope that they stop treating any candidates this way. Just because there is high unemployment should not mean we have to be performing monkeys to get a job.

Another thing that the company seems to have forgotten is that these days interviews are strictly based on criteria and if they refuse to give someone a job on spurious grounds they can be called to account for it. There is no indication what criteria they are using to judge people at this presentation, but given the immediacy of the response it must be on first impressions. They have to be careful here, because first impressions are the basis on which disabled people and other minorities lose out with the average interview panel, and people rejected for such wishy-washy reasons will come down heavily, and with legal backing, on employers.

I just hope this employer is an aberration and that from now on I will encounter only rigorous and rational interview processes. However, I fear the approaches of entertainment have now contaminated what is a far more serious activity.

Friday, 24 July 2009

In Interviews Say As Little As Possible

Having just failed to get a job from the seventh interview I have done in the past 3.5 months, I have learned something to add to the things I have written here before about the way you have to twist and turn in order to get recruited. One problem that this latest interview has re-emphasised is how little departments know about what the human resources department is doing or how it approaches recruitment. I have applied to over twenty companies in recent months and the forms you have to complete and the information they want vary considerably. The most involved required me to present copies of my 'O' level certificates (examinations I took 1983-4) along with the originals. At the interview a man compared the copies and the originals and signed the copies. These are apparently kept on file for a year even though I was not offered the job.


In the latest interview one of the interviewers, both at the time and in feedback, asked about my work pre-2000, though there was no room on the application form to include it. She said she always wants to know about 'gaps' in people's CVs, especially as she could find no declaration that I had no criminal record. Of course, she was oblivious to the fact that, at her company, the human resources department gets you to post in that declaration in advance, and I had done just that. She was bewildered by the application form I had completed despite it being the one on the company's website. This was from a company that was far better organised than ones that have interviewed me recently. Even then, though, I felt that I could not be judged properly if the interviewer was getting different things to what she expected. This time it was particularly depressing as I was told I had failed, not because I did not have the range of skills they wanted, but because of that interview.


My key problems these days in getting a job are either that I have not had four careers in a range of different fields including human resources, product quality, strategic planning, market research and marketing or that I seem too relaxed in interviews. I cannot do anything much about the former. Even if I went back in time and directed my younger self I could not go up all of those career paths without holding down multiple jobs.  As it was, I had 4 part-time jobs simultaneously for 2 years of my life, though clearly not in sufficiently diverse settings.


The attitude of interviewers is clearly something I should be able to alter which is why I feel so bad about making the mistakes I did this time. I am just about to sign on unemployed and getting a job now would have meant me avoiding the 'gaps' in my CV that people will pick up on in the future. My mistake in 35 minutes could have altered my future greatly. I still anticipate a long period of unemployment and having my house repossessed but it seems with a slightly different approach I may have been able to snatch myself out of that situation at the last moment.


I have done scores of interviews in my life for courses and for jobs. I always do a lot of preparation before I go into an interview and find out all I can not just about the company and its attitudes, but as much as I can about the members of the interview panel themselves. As a consequence I am not intimidated when I go into interviews and am often far more relaxed than the interviewers, many of whom are amateurs at recruiting, have had no training in how to interview, and, as noted above, are often ill-informed even about how their company conducts recruitment. I always score well for my presentations; interviewer after interviewer has said I do that very well. It is when the questions start that I am clearly blundering.


The mistake I made at this latest interview was to assume that the interviewers would drive the interview. They complained at the end and in feedback that they had not had time to address the questions that they had wanted to cover. I thought: 'well, whose fault is that?' It is not me who sets the agenda of the interview. I respond to the questions as asked and try to give as thorough a response as I can.

I was bemused that they asked in depth about the structure of the company I am being made redundant from and also about my jobs pre-2000. Neither of those things is really a basis on which to judge my qualities now. Apparently, though, I was supposed to be dismissive of these things and focus on me. I did try to get them back to me here and now, but that was apparently not the right approach either. In their view I should simply have given terse responses allowing them to get on to the questions that they felt were more important. As we apparently had no time for these, they said they could not judge me properly and so I stood no chance of getting the job.


There is another issue here: how short interviews are becoming these days. Back in the 1990s, 45 minutes was seen as normal and there was slack for it to run to 55-60 minutes if necessary. I did have a 75-minute interview recently, but that was because I was the only candidate and the interviewers seemed to feel they had to get a whole long list of standard interview questions in as well as talking about the specifics of the posts. These days for interviews, 15-30 minutes is the norm and there is someone waiting outside to follow in immediately. Thus, if they do not get to the questions of importance you fail by being 'timed out'. There is no sense that the interview should or could continue.

My mistake is that I feel people want a genuine answer in which I can show the breadth of my knowledge and experience, whereas in fact they want a terse response to enable them to tick a box. I knew that kind of interview back in the early 1990s when in the civil service, but it was more apparent that it was mechanical in that way. I guess, given the inexperience of so many interviewers, it is a mistake to try to judge the kind of interview you are in from how the interviewers behave.

Obviously, I need to learn the trick of getting the interviewers to focus back on me as I am now, yet without seeming to take over the interview. I need to get this focus across very tightly despite the fact that at the level I have been working, it is difficult to explain a complex project that I have worked on in a few words, but clearly it is what I need to learn. However, now it might be too late. As the days pass and I step further and further into a bleaker future, failing to kill myself last August seems daily to have been the greatest mistake.

Wednesday, 22 July 2009

The Mutation of the English Language

Note that this post is not called the mutilation of the English language. As it is, English comes in quite a few varieties and even back in the 1990s I started seeing 'American' phrasebooks, so titled, in use in France and Germany. Having worked with South Africans over the past few years, I have seen the intricacies of an English with both British and American input. Though they number the floors the way the British do, i.e. the ground floor as opposed to the first floor being the lowest, they refer to 'pants' to mean trousers rather than underpants. Of course, they mix in some words of their own, like 'conscionise', which I have heard nowhere else; unlike either the British or the Americans they measure distance in kilometres; and as for 'stay' and 'live', as in 'where are you staying/living?', the meaning is exactly the reverse of what it is in Britain. The Americans had the input of Noah Webster (1758-1843) who through his 'An American Dictionary of the English Language' (1828) moved the USA to spelling 'harbour' as 'harbor' and so on. There are differences in grammar too, for example, in the USA you 'write your father' whereas in Britain you 'write to your father'. In American English the 'building is named for Abraham Lincoln' whereas in Britain the 'building is named after Abraham Lincoln' and so on. The Americans still have a Simplified Spelling Society: http://www.spellingsociety.org/journals/j9/american.php with some sensible suggestions for easing the confusion in the English language further.


Having a seven-year-old living in my house, I know how mad the English language can appear to newcomers. Just this morning there was a mix-up over 'paws', 'pores' and 'pause', which in the dialect (middle-class, southern England English) I speak are all pronounced the same. Given that English is the second most spoken language in the world, it seems mad that we have adhered to all the complexities of this language and not made it a bit easier, not just for foreigners but also for our own children. A lot of it comes out of the way English sounds: setting aside dialects, even in plain English 'fear', 'pier', 'weir', 'here', 'cheer', 'kir' and 'Kia' all rhyme, whereas, for example, 'schliessen' could never rhyme with 'schreiben', and most languages are the same.

The problem for the English is that they had too many inputs. We had the roots of what became Danish and Swedish, German and Dutch, as well as French and Latin input too, and mixed in words from Hindi such as jodhpur and bungalow. You could argue that this makes the language dynamic and so easier to use in the modern world. This can be compared with the challenges of, say, the Vatican and the Universities of Oxford and Cambridge in finding Latin words to describe modern technology, or, more often, of those who oversee the Welsh language in creating vocabulary for such innovations without simply taking the English word and pronouncing it in a slightly different way or altering the letters. The Welsh word for Pakistani is Pakistanii; the Welsh word for tyre (tire in American) is teiar, which in my view is pronounced the same. However, it means that to learn English you have to learn loads of exceptions and it has recently been declared that there is no point in learning spelling rules for English because there is so much variety.


Anyway, whatever learned scholars and politicians might suggest about altering English, it is changing while we sit here arguing. A lot of this stems from technology. Playing online games, as I have commented before, and writing the way I do here, I use a very different language to that of many others. 'Lol' seems to have become a word now (stemming from Laugh Out Loud) and there are less common ones like IMHO (In My Humble Opinion, which sounds like Dickensesque language brought into the 21st century). Of course this stems from speeding up the transmission of messages when texting or typing. Some are simple contractions, such as 'soz' for 'sorry' and 'luv', which is an old contraction from way back. I find the adoption of 'u' for 'you' interesting as u is the Dutch word for you and for a few shifts in history we could be using that. Such contractions even begin to evolve their own grammar, so 'ur' stems from 'u' and means 'your', as in the Pink song title, 'U + Ur Hand'. Of course not all texting is about words and even the oldest mobile phone users will recognise :-) and many ;-) though XD might be more of a challenge. So, for many, symbols are now mixed in with words, taking us back to a semi-hieroglyphic form of English.


Young people have always sought to use slang and even code languages, both to mark them out from parents and to have a sense of belonging with others in their peer group. Pig Latin is probably the most enduring one, but there are thousands of local varieties. Such a language was an element in an episode of 'Wallander', the Swedish detective series, that I was watching recently, giving a little challenge to the subtitler. However, I believe that the current mutation of the English spoken in Britain is going deeper than that. In some ways it is based on social class, with adult working-class people adopting the text speak that middle-class adults leave to their children. We are not at the stage of Imperial Russia, in which the nobles spoke French and only the peasants Russian, but you can mark yourself out in British society by the language you use. Of course, this has always been a form of segregation. My father always quoted sayings about how an Englishman defines himself by the way he speaks. There is RP - Received Pronunciation, the upper-class, South-East England accent which for decades was all that was permitted to be heard on British broadcasts. In more democratic times things have shifted and a few years back it was noted how even the Queen had moved towards so-called 'Estuary English', stemming from the cadences of Essex, where the estuary of the River Thames is located.


Having spoken back in the mid-1990s to a woman who was researching the impact of Arabic on the French used in urban France, I notice that British English is not mutating that way. We seem to have had our influx of Hindi words more than a century ago and there is no flow now of Urdu or Arabic or Polish phrases, perhaps because Britain has always had such a diversity of immigrants rather than a concentration from a particular region, as happens in France due to its intimate connection with North Africa. However, in the UK our changing language is now moving beyond pronunciation to shifts in grammar. These seem prompted not just by simplification of the language for ease of use in speech and technology, but because errors are beginning to stick and become what is now seen as the proper approach.


One that has been developing for quite a while now is 'I could of won the match if I had tried harder'. In this sentence it should be 'I could have won ...', with have defining the tense and the emphasis of the verb. This has come about simply because 'of' and 'have', let alone ''ave', sound so similar that people mix them up. Now they write that way. The sense is not really lost but there is no way to explain why you would use 'of' at all in the sentence. Less tortuous, but perhaps more bizarre, is the appearance of what was once called the 'grocers' apostrophe'. The fact that it could have been 'grocers apostrophe' or 'grocer's apostrophe' begins to show some of the difficulty. I was told at school that in Tudor times a man would write: 'Henry Brown, his book' and over time this evolved into 'Henry Brown's book', with the apostrophe after Brown showing the dropping of the 'hi' from his. I am sceptical of this explanation, partly because I learnt German and in their possessive cases they often have -s on the end of a possessive noun or adjective. It also does not explain why it never became 'Elizabeth Tudor'r book' for women. I can understand that over time, to speed things up, people would drop the apostrophe and say 'Henry Browns book', but whilst this has happened in many cases of popular usage, the apostrophe has now been added to plurals instead. This brings us back to the grocers, because it was first seen in signs advertising egg's, cheese's, cauliflower's and so on. In some cases, notably potato's or tomato's, it could be acceptable to point to the missing 'e', though that was not the intention. I realised that this unnecessary use of apostrophes in plurals had come into mainstream usage in 1998 when, in the Cabinet Office in London, I came across a report on the work of different government departments. It was a very glossy publication which would have cost you £50 if you had chosen to buy it, and there I saw such apostrophised plurals. If it was being used at the highest government level then the rest of us need to get in line. The clearest swap has come in the anomaly that is it's and its. Its is, like his and hers, a possessive pronoun, but of course, in the past people expected possessives to be made with an 's as in Henry's book. So, it was very common for them to use it's instead. However, it's is in fact a contraction of 'it is', with the apostrophe showing where the 'i' once was. With the disappearance of apostrophes to show missing letters, and didnt, cant, etc. becoming words in their own right, the switch of its and it's is now complete.


The problem, of course, is that in contrast to France and Germany, where the government and its appointed bodies make declarations on language usage, whether on having three 's's in the middle of a German word or on how much music should be sung in French on the radio, no-one in Britain makes any ruling. This is where the key problem lies: confusion. If I read 'the girl's coughing' now, is it about the coughing of one girl, a joint activity by a number of girls, or the very immediate occurrence, i.e. 'the girl is coughing' with the 'i' simply dropped, rather than something more continuous? Of course, a lot of language is contextual and that is possibly why the class issue creeps in. When working at a warehouse a man once told me, in a calm tone, that he was going to 'fucking fuck that fuck', which meant he was going to unload his lorry. Of course, that phrase in anger might have meant he was going to beat up someone or he was going to win a race or a score of other things. However, standing by the lorry he had driven in, the context made it more or less clear, though still he might have set off to beat up someone who had cut him up or something and used the same phrase. Communication, though, is not about the single instance; it is about broader usage. While we may use language to emphasise the exclusivity of our group, it is actually more vital in allowing us to be understood by those beyond our own group. While we get by day-to-day, there must be thousands of cases when clarity was needed and instead there was confusion. I am not saying we should or can stop English evolving, but once in a while we need someone to clarify what the baseline is. I never liked American spelling, but I now feel that simplifying English, especially now that so much of it is taught through phonics rather than visually, could only help not just newcomers to the language but also those of us who have been using it for decades and want to ensure we are understood and can understand what is being written to us. In addition, there may be other benefits, because it is often argued that the British are so poor with other languages because they find their own so difficult.


P.P. 24/07/2009: One thing I neglected to mention is the disappearance of silent letters from written English. To some extent, I suppose this is overdue; the most common examples I have seen recently are 'wich' rather than 'which' and 'wat' in the place of 'what'. Of course 'wot' for 'what' has been a common element of the graffiti writer's vocabulary for many decades, but 'wat' seems a bit more elegant and has the example of Wat Tyler, though in that case it seems to have been a variant, perhaps a contraction, of Walter, which ironically means 'commander of the army'; so in the context of Wat Tyler (1341-81; full name Walter), a leader of the Peasants' Revolt in England in 1381, it might have been a title rather than his real name.

Friday, 10 July 2009

Kurt Wallander and Perceptions of Sweden

It is interesting how people's views of a place are often defined by fictional crime stories set in those locations. I think this is particularly the case when those stories become television series. In the UK I think in particular of the Oxford shown in the 'Morse' series (1987-2000, based on novels by Colin Dexter published 1975-99) and the 'Bergerac' series (1981-91) set on the island of Jersey. Often, in fact, the locations featured are not the places they are supposed to be. This was particularly the case in the middle episodes of 'Morse', in which St. Albans, which is far cheaper to film in than Oxford, often stood in for that city. Of course it is not just detective stories that have this impact; I have commented before about the veterinarian stories of James Herriot and how they attracted fans to the Yorkshire Dales, and the Jane Austen craze since the 1990s has had a similar impact on certain locations in England.
Sometimes a television series has to go right outside the real-life setting of the novels to show appropriate locations, primarily because of the time that has passed since the novels were set. I have recently been watching the 1992-3 British series of 'Maigret' (based on novels and short stories written by Georges Simenon between 1931 and 1972), starring Michael Gambon as the eponymous detective. As with the David Suchet 'Poirot' series and the Joan Hickson 'Miss Marple' series on British television, rather than straddling the decades as these characters did in the novels, the makers select a decade that they feel best suits the detective's manner. Poirot has been allocated to the 1930s and Maigret, like Marple, has been put into the 1950s. To reproduce 1950s Paris in 1990s Paris, in fact in the bulk of 1990s France, would be impossible, so locations in Hungary were used. Yet, watching the series, you feel they have reproduced the era and its French settings perfectly.

Anyway, place is important for these series and this contrasts with other detective series such as 'A Touch of Frost' (1992-2009) which has a very uncertain setting, sometimes seeming to be located in the Thames Valley, sometimes somewhere in northern England instead. This is how I come to my perception of south-eastern Sweden and how it has been shaped by stories featuring the Swedish detective Kurt Wallander. The novels of Henning Mankell featuring Wallander were published 1991-9 though another is due for publication this year, and another featuring Kurt playing a secondary role to his detective daughter, Linda, appeared in 2002. Though unlike Morse, Wallander has been married, his manner is similar. In many of the stories he eats poorly, exercises little and drinks too much alcohol. Whilst he can get inside people's minds and be sympathetic to the ordinary people he is involved in cases with, he is also pretty socially dysfunctional especially with colleagues and his daughter. He also has a bad relationship with his artist father, though in turn he is not an easy man to deal with. The Wallander stories do not pull punches and the murders that he investigates are often brutal and stem from unpleasant occurrences and lives. There is also often a political element involved too.
The Wallander series was translated into English 1997-2008, but it was the fifth novel in the series, 'Sidetracked' (1995; translated 1999), which really broke through into the UK market and this is why it was the first one to be made into an English-language production. In recent months I have seen episodes from both the Swedish-language television series starring Krister Henriksson from 2005-6 (there were previous/concurrent Swedish movies 1994-2007 starring Rolf LassgĂ„rd) and the English-language television series of this year starring Kenneth Branagh. Many of the Henriksson episodes are stories written for television rather than being based on original novels. This happened with the Morse series too. With a television series usually having a minimum of 4 episodes, but often 6 or 13 (representing an eighth or a quarter of the year), there is a need for a lot of stories and the television producers get through them quickly. It is only when you have as many as in the case of the Sherlock Holmes, Maigret, Marple and Poirot stories, written over decades, that you are unlikely to run out. They made all but 17 of the original 60 Sherlock Holmes stories with Jeremy Brett as Holmes, 1984-94. Similarly Derek Jacobi featured in 13 episodes of 'Cadfael', across four series, based on the 20 novels and one collection of short stories published 1977-94 by Ellis Peters. So, it seems that to keep a series satisfied an author needs at least 20-40 stories, and so far Henriksson is already slated to appear in 26.

In my general discussion of television detectives I have wandered from my key point, which is about the perception of Sweden that has been thrust upon me by the two Wallander series I have seen episodes from. Of course, series often act as travelogues for the region they are showing. People easily write off the murderous aspect. 'Bergerac' had 87 episodes; not all of the stories featured a murder, but some had more than one, so let us say it showed 87 deaths over a 10-year period. In 2001, Jersey had a population of just over 87,000 people, plus of course it has thousands of visitors, but it would have been a pretty high murder rate, way above the actual level for the island. It clearly did not impinge on tourists going there and in fact probably helped contribute to the numbers by showing how nice the place is.

The Wallander stories are located near Ystad, probably the most southerly town in Sweden with a population of only 17,200 people. So it is a rural area though with ferry and train connections to Denmark and ferries to Poland and Estonia too. However, its level of preservation is very high and it looks historic and picturesque. I suppose this is like Colin Dexter using Oxford as the backdrop for his Morse stories. The way it is shown in the series is as an almost unpopulated area, though SkĂ„ne County has a population of 1.2 million and though it only covers 3% of Sweden, contains 13% of its population. The skies are big and you feel a similarity with the northern states of the USA. The filming is often done in an almost under-exposed way, especially in the British version, to emphasise the length of the Summer days and the purity of the light in the region. With the lack of people shown it almost gives it an ethereal feel. Given that Kurt Wallander's father is supposed to have painted the same landscape 7000 times it fits in with the other-worldliness of the location, perhaps as a counterpoint to the brutal murders which Wallander and his daughter investigate. Perhaps boredom is a motive as characters are shown having perverse sex lives, torturing animals and getting involved in religious or political fanaticism.

Another counterpoint is between the beauty of the landscape and how dull all the characters' homes appear to be. They look as if they are living in East German barracks at the height of the Cold War. I am surprised that Ikea, the Swedish furniture company, has not stepped in to try and alter this portrayal of Swedish interiors as being so dull. Despite all the wide open spaces portrayed and the outdoor lifestyle, you come away from watching these programmes feeling that everything is very stifled, claustrophobic.
In some ways, watching series featuring Kurt Wallander, I find them almost as uncomfortably dreary as the settings of the series 'Supernatural' (running since 2005, with the 5th series currently commissioned). This is a US series about two young men who travel through small-town USA fighting demons, ghosts and other supernatural creatures. I find it unnerving, not because of the horror aspect, but because each week they are in yet another dreary, dead-end, one-horse town in the USA and even those people not experiencing supernatural events are facing the bleakest lives possible. Ironically, like many US series, it is actually filmed in western Canada.

Even the police in the Wallander series seem housed in something resembling a community centre which is constantly being reorganised. Liberals in Britain have often pointed to Sweden as being a model society, but perhaps, as with all model societies, it is a very dull one. Women are shown as playing an equal role. This is something that is striking if you access the websites of companies and public institutions in Sweden and the neighbouring Scandinavian states: you see women in prominent positions, in equal numbers to their male counterparts, sometimes in the majority. In the UK we may pay lip service to equality, but it only takes a few moments looking at the Swedish counterpart companies to see it is a reality there. Taken as a whole, however, the Wallander television series challenge the 'sexiness' people in the UK perceive in Sweden and instead show the country as screwed up as Britain is, and perhaps even worse, because when it is not dysfunctional it is tedious, a country that will drive you mad through its bland nature. Of course I will continue to watch because I am always interested in crime stories set in different times and places, but I do wonder if the Swedish tourist board should worry about the international success of the Kurt Wallander stories on television and in movies. They do show a beautiful landscape, but very convincingly a country you could not stand to be in for more than five minutes without falling into utter despondency.

Thursday, 9 July 2009

Why I Believe Lance Armstrong is in the 2009 Tour de France

Many people have asked why Lance Armstrong, the Texan seven-time consecutive winner of the Tour de France, decided to come back and race this year, not having been in the competition since 2005. Armstrong dominated the race 1999-2005 and at times had a cool demeanour, partly because as he became more successful he faced questions from the media, especially in France, who felt that he could only have succeeded with the use of drugs. You have to remember that in this era leading racers in the competition were eliminated when it was found they had been taking drugs. However, despite the numerous tests, Armstrong came through with a clean bill of health. Armstrong is also renowned for the fact that he survived an aggressive cancer which started in his testicles but spread to his lungs and brain. He has established the LiveStrong charity and a foundation to help people with cancer. When Armstrong, aged 37, returned to the Tour de France this year on the Astana team based in Kazakhstan, people were surprised. He holds the second position on the team behind Alberto Contador, the winner of the 2007 Tour de France, seen as a very strong candidate for victory in 2009. Contador was unable to ride in the 2008 Tour de France because Astana were blocked due to the connections of the former team management and cyclists with drugs, notably team leader Alexandre Vinokourov, ejected from the 2007 race for doping.

Now, it is very unlikely that anyone will ever equal, let alone surpass, Armstrong's record in the Tour de France and many people worried that, coming back after 4 years away from the race and 3.5 years out of professional cycling, he would end up humiliating himself. Given that by the end of the fifth stage he was 0.22 seconds away from holding the leader's yellow jersey, I think he has blown away any such concerns. Perhaps 29th in the Tour Down Under, 7th in the Tour of California and 2nd in the Tour of the Gila, with a 3rd place in the Team Time Trial for Astana in the Giro d'Italia, were hardly astounding results at the highest level, but they show that Armstrong is not unfit. The clear factor, as seen on Stage 3 of the Tour de France, is that Armstrong is experienced. He could always read the field well and the tactics he pulled in the mountains in the years of his victories were as vital as his stamina. This is what made Armstrong, though at times rather irritable (and we had not yet experienced Cadel Evans taking that to a new height), far more exciting to watch than the morose Miguel Indurain, winner of the Tour 1991-5, who just hammered out a pace. The break in the field on Stage 3 was not expected, but Armstrong saw it and went with it; Contador perhaps lacks the experience, or maybe the sixth sense, that Armstrong has.

One interesting thought is that, with seven of the nine Astana riders currently in the top 10 of the race, it shows how much less success my least favourite rider, Cadel Evans, would have had last year had Astana, and particularly Contador, not been absent. I think Evans simply got lucky. I was glad he did not win last year, but I think that he should not even have got as close as he did. The man is incredibly self-centred and arrogant and gives professional cyclists a bad name. I know they can be terse, but in his desire to be a diva he has stepped into the realm of rude and has put himself on the same level as the worst of boxers and football players. He is rapidly disappearing from the upper levels of the General Classification and, in my view, all to the good. I trust that this year, with the race back to a proper standard, he will not come close to the 2nd position he held in 2007 and 2008.

To some extent I feel Armstrong has been sincere in saying he did not come to the Tour this year to win. I feel he has been surprised by his success, but now senses it might be possible to win again. However, we have yet to see the mountain stages in which Contador excels. Hopefully he will listen to Armstrong about playing the field and do so in the way that allowed Armstrong to blow away his close opponents in previous years. The thing that I believe brought Armstrong to this race was not the chance of another win, but sheer enjoyment. I am sure that he is pleased that Swiss rider Fabian Cancellara is still wearing the yellow jersey. Obviously riders will keep an eye on Armstrong, but not to the extent that they would have done if he had also been in yellow. What brings us back to watching the Tour year after year, people standing by the roadside for hours, waiting for days for a few minutes of a passing cavalcade? It is enjoyment of the event, and the Tour de France is the most prestigious of cycling's events. I think Armstrong enjoys simply taking part, riding those roads, being mixed up in all of the day's events. Though not all the pressure is off him, and he is getting more attention than he perhaps might have hoped for, it is certainly less than when he was winning yellow year after year, so he can enjoy the event to a much greater extent and share his vast expertise with the next generation like Contador, only 26.

I think we can see a similar motive in the German rider Erik Zabel, a sprinter who was still riding in last year's Tour de France at the age of 38, having won the green points jersey every year 1996-2001 and by 2008 still coming 3rd in that competition. Zabel is now a technical advisor to Team Columbia HTC, on which the British sprint marvel Mark Cavendish (who has already won 2 stages this year and managed 4 in total last year) rides. These men love the sport and are going to be at the heart of it as long as they can. For Armstrong it is not a sad swansong; rather he can participate in an event he loves without everyone spoiling it for him by targeting him day after day. Well, that was the theory. His experience and clearly enduring strength may make things more complex for him in the next fortnight.

Wednesday, 8 July 2009

Timewarp Britain: Summer FĂȘtes

Given that the term 'fĂȘte' comes from French, it is interesting how it has become such a British institution. Of course in England it is pronounced as 'fate' as opposed to the French pronunciation 'fett'. In France it sums up something rather like what we term a carnival, a town-wide event. I know it is dated, but in my mind it will always be encapsulated in the movie 'Jour de FĂȘte' (1949) starring the comic actor Jacques Tati. It is available to watch for free in 12 parts on YouTube: http://www.youtube.com/watch?v=O75gUQmwdTM&feature=related ; there are black and white, partially coloured and colour versions.

Anyway, the British fĂȘte is something different, usually confined to a single location such as a school or a church; larger events are deemed at least 'parish days', or, more in favour these days, 'carnivals', which often have a procession leading up to them and also tend to have fairground attractions as well as the more low-key stalls. The county fairs, like the one in Hampshire I commented on last summer, are of a far larger scale. The objective of fĂȘtes is fund-raising and this is done in a range of generally traditional ways. In part the lack of change at fĂȘtes is an endearing element of them. Aside from the bouncy castle and the face painting, an organiser of a British summer fĂȘte of 1949 would not feel out of place at one in 2009.

I was a big fan of fĂȘtes in my youth and worked at many, as my parents were active members of the PTA (Parent-Teacher Association) of the schools I attended, the bodies behind such events at schools. FĂȘtes need a reasonable amount of space and sufficient people who can spend a decent amount of money playing simple games and buying cakes. They are frivolous events and I guess this is why I never saw any occurring in East London, where there was a lack of space to host such things and the population lacked the spare cash. Interestingly they did not seem to occur even in Milton Keynes, though it has a great deal more space. Maybe that says something about the nature of the population or the fact that the city is so thinly spread. The natural home for the fĂȘte seems to be from large village size up to leafy suburb; they are less common in really rural areas or inner cities.

In the past fortnight I have attended two fĂȘtes, one at a church, one at a school. Though every fĂȘte is unique, there are many common elements. It is usually 'opened' by a notable; this can be anyone from the local vicar or school head to a celebrity of various degrees of stature. Sometimes you get something slightly different. I attended a fĂȘte opened by a real owl and another opened by Queen Elizabeth I (in fact a teacher dressed as her). Maybe with our perception of celebrity changing with so many 'reality' and talent shows, this is becoming a less common element than it was in my youth, when I attended fĂȘtes opened by people like DJ Ed Stewart and the actors Bernard Cribbins and Buster Merryfield.

The backbone of the fĂȘte is that there will be a number of stalls, some selling items and some with games. The items tend to be of a particular nature such as 'white elephant' or 'bric-a-brac' (wonderful English phrases in themselves), i.e. second-hand ornaments. There may be some second-hand clothes, but these tend to be reserved for jumble sales, which are usually held in the Autumn and Winter rather than the Summer. There will be a stall selling homemade and often bought cakes as well. One noticeable thing compared to my youth is that the people running these stalls now all wear plastic gloves and all the ingredients are listed on each homemade cake so that the event does not get sued by someone with a nut allergy. They often sell cupcakes (also known in the UK as 'fairy' cakes) and classic home-made recipes such as chocolate-cornflake or chocolate rice-crispy cakes. There are usually a lot of heavy fruit cakes, some sponges and sometimes some nice banana or coconut cake, always a good bet as they tend to be lighter.

In addition to the cake stand for cakes to take home, there will be 'refreshments', usually tea and cake with orange cordial for the children; sometimes crisps are available as well. This tends to be run from a large tent or the hall of the school or church. Invariably these are served on institutional crockery, if you are lucky, Beryl ware. Summer fĂȘtes are distinguished from their Winter equivalent or jumble sales by having a barbecue, usually of burgers and hotdogs. The vegetarian option has now crept in here. However, the task of cooking the food still seems to be allocated to any black or Hispanic man who can be found in the neighbourhood, with pale white women serving the customers. There is clearly some racial stereotyping going on. In our district white South Africans tend to be the best barbecuers, but the job is given to a black man who has moved from Birmingham. You can read a lot about a neighbourhood from its fĂȘtes. I must say the quality of the burgers has risen over the years; I had delicious homemade ones at the church; onions seem to be less burnt than they used to be and the rolls less doughy. I do not know if my tastes have deteriorated since the 1970s, but quality in fĂȘte barbecue food seems to be on the rise, perhaps it is the Jamie Oliver effect.

Other stalls sell things like pot plants, old children's toys and second-hand books. What struck me recently was how little prices have risen. I was still paying 10p (€0.12; US$0.16) for a second-hand book, the same as I did in 1978, although in today's values 10p then is worth about 50p now. This seems mad as costs have risen since then. In that time, cakes seem to have gone from £1 (equivalent to around £5 today) to £4, so seem to have kept up better with inflation than books and toys. The only time I remember someone breaking from this was a woman who took over running the second-hand book stall at the local Scout jumble sale near where I lived. I went for many years and each book was meticulously priced, with different prices rather than the blanket price most places charge. This woman set prices by assessing the wealth of the potential buyer. It was incredible how much resentment this caused and it led to people abandoning piles of books. She did not realise that even at a charitable event people want to know what they are letting themselves in for (they may be saving their change for a cake or a cup of tea) and having an arbitrary figure imposed not only lost her sales but meant regulars did not come again. People can tolerate higher prices than 10p, but they need to know what the prices are.

Aside from the stalls selling items, there are the games. These are usually old fashioned and very low key. One universal one is the tombola. Here again prices have not risen much and I got 5 tickets for £1. You open the ticket and see if it matches one of the numbers on an item on the table, usually numbers ending in a 5 or a 0, so giving a 1 in 5 chance of a prize. The 7-year-old boy from my house won 3 out of 5 at both events, and began assuming he would always win. I took him back so that he would lose more and better understand the odds, but then feared that I had unleashed a gambling monster as he said 'if I just buy a few more tickets, then I'll win again'. Having rarely won anything in years of tombolas, I knew that way leads to poverty. Sometimes there are specific bottle or cuddly toy tombolas; most have a random selection of items. The boy from the two fĂȘtes won a selection that probably sums up the kind of items you get: an old, though unused, radio; a jar of tomato relish; a tin of tuna; a teddy bear; a small candlestick and a large pot of scented moisturiser. Other games include the coconut shy, probably the only time in the year most people get close to an unprocessed coconut; throw wet sponges at someone, often a teacher; smash crockery (this is where I saw a Beryl ware plate of the kind selling for £2-3 on eBay get smashed); roll the 2p across a board, where it lands may result in a prize; spin the wheel; bash the rat; pick a nail; the raffle; racing (battery-powered) pigs (or their equivalent) and so on. Less active are guess the name of the bear/doll, the weight of the cake, the number of sweets in a jar and so on. There are sometimes innovations: I saw 'pan for gold', in which you had to sift wet sand to find little bits of metal, finding sufficient won you a prize. Back in the 1980s home computers briefly brought computer games to fĂȘtes, usually lovingly typed in by some teenager, and the one with the high score at the end of the day won a prize, but the difficulty of shading the screen, and the fact that now we all have computers, meant that innovation died. Getting a remote-controlled car around a course has been a bit more enduring and would be something the fĂȘte-goer of 1949 would not recognise, though the one of 1979 would.

So, all of this is going on while you are losing money amiably or winning items you will donate to another charity event. The big innovations are the bouncy castle and face painting. I think face painting appeared at fĂȘtes when I was living in London in the 1990s and not attending them any longer. Yet it is now a stalwart element of these events and for the day you see boys with bat or tiger or spiderman faces, girls with butterfly faces. I never understood the fascination but it is immense. There are books on doing it and a lot of effort is taken. I think this was best satirised in the second series of 'Phoenix Nights' (2001), whose 'fun day' episode took off a lot about British fĂȘtes. Anyway, until someone stops face-painting on some health and safety grounds, it looks like it is here to stay. The same can be said for the bouncy castle. I love the fact kids can literally go mad springing off the walls and floor for 15 minutes. I wish they had been invented in my day. Someone usually collides with someone else and there are tears, and I am sure that soon some parent will sue a school or a church and then children will only be allowed on them one at a time, harnessed to the side and wearing head protection.

The final element, aside from the uncertain weather which accompanies any Summer event in the UK, is the performances. I rarely pay much attention to these, but they are an important part of the fĂȘte and must make many parents, teachers and instructors proud. Generally you get some kind of sports demonstration, something like a local martial arts group. There is usually musical input, whether from a local marching band or brass band or from a school orchestra. There are often girls in leotards dancing or twirling batons. At large fĂȘtes there may be dog or motorcycle displays by the police or some local military unit. All of this is pretty unremarkable, but can be enjoyable for children who do not get to see much live entertainment these days. As a teenager interested in history and with a love of fĂȘtes, I used to particularly target those which featured historical reconstruction elements. There are huge numbers of historical reconstruction groups. In the 1970s the English Civil War was the popular focus, but these days there is a range from Roman and early Medieval through to Napoleonic. These events seem to be decreasing in number, again possibly due to health and safety concerns with all that black powder going off. I did see a group who did 18th century dance at one church fĂȘte three years ago, all in appropriate costume, the musicians too.

Sometimes there is an odd anomaly thrown into the mix which makes a fĂȘte different. I attended one near the Fulda Gap in what was then West Germany which the future Chancellor, Helmut Kohl (born 1930), also attended, with an amazingly low level of security. He sat at a table behind ours in the restaurant cutting up his elderly mother's fish. That was not a British fĂȘte, but I was just reminded of it. It had many of the same elements, though with much more traditional clothing worn by the audience. The one that struck me the other day was at the church fĂȘte, at which the vicar's brother (both must be 1.95m and broad with it) performed songs in Spanish with a group consisting of a teenager on a keyboard and a late middle-aged woman on percussion. Twice he sang a song about Che Guevara helping pull off the Cuban Revolution, which seemed pretty anomalous in an English churchyard. It was as if a Graham Greene novel had come to life. He latterly moved on to 'La Bamba' and other more anodyne material, but it was a delightfully quirky element to the proceedings.

Though Summer fĂȘtes have evolved, it is a slow process and if you ever enjoyed such events before, then you will not be disappointed by the ones today. Take along a child to an event which is about human activity rather than what they can do electronically and they may actually find they enjoy it. For primary school children there is a lot of practice at handling money (the 7-year-old with me always overpays, 'because I get more change back' being his theory) and the stallholders are more patient than the average shopkeeper. I know Britain is embedded in its past, which blinds it to much of contemporary life, but in my view the fĂȘte manages to connect past and present, and it is raising money for a charitable cause. May they long prosper and not be choked off with excessive health and safety concerns.

Monday, 6 July 2009

Online Behaviour: Greed and Need in The World of Warcraft

I have written previously about my lack of success in getting on to one of the major social networking sites, Second Life, but it remains an area of interest to me. I was introduced to the World of Warcraft system by the woman who lives in my house. It has been running since 2004 and is what in the old days we would have called a MUD (Multi-User Dungeon), though the acronyms have grown since then and according to Wikipedia it is a MMORPG (Massively Multiplayer Online Role-Playing Game), which in itself sounds like a beast from a Tolkien novel. Currently it has over 11.5 million subscribers, making it a virtual country with a population larger than many in Europe. You cannot interact with all of those players, because there are a variety of servers, parallel versions of the world. In Europe I know there is a French and a German server for players from those locations; the British server also hosts players from the Netherlands, Scandinavia and, it appears, Poland, but the language tends to be English with some Scandinavian dialogue. When I say English, in fact, unless you know text speak, it might as well be a foreign language. This is unsurprising as, when battling a dragon, it can be tough to write grammatically correct sentences at the same time.

As the titles Dungeon and RPG suggest, the game owes a lot to the paper-and-dice role-playing games of the 1970s-80s; though they are still played now, they are less popular than when I was young. Many of the type of people who would have been avid 'Dungeons & Dragons' (D&D) fans twenty-five years ago are now on World of Warcraft (WoW), and some of us from those days are in it too. The basic premise is that you explore ruins, caves, dungeons and castles and battle with fantastical creatures and aggressive people to steal gold and artefacts. As you fight and discover, you gain Experience Points and rise in levels, so getting access to higher abilities and even better equipment. As in the RPGs of old, you play a role, hence the name. You name the character and select the class; the types in WoW are like those of D&D: spellcasters - mages, warlocks, shamans, priests and druids - and other types such as warriors, rogues and hunters. Each has their own special skills and can use different weapons and armour. You can tailor your face, your skin tone and general appearance and even get shaves and haircuts in the game. The avatars of female characters tend to be elegant, sometimes even sexy, and the male characters either robust or mean looking.

There are also professions. Your character can become a tailor or a leatherworker or a blacksmith or an engineer or an alchemist and so on. Some players ignore these skills, but there can be a satisfaction in taking a raw commodity and creating an impressive robe or potion, and of course all of the output can be sold or used by players, for example, weapons, armour, health potions, etc. The level of technology is like that of the PC game 'Arcanum: Of Steamworks and Magick Obscura' (2001). There is magic, but there are also flintlock guns, steampunk motorbikes and even dynamite alongside the longbows, skeletal horses and fire spells. The continents are linked by Zeppelins, but within a continent you can fly from town to town on the back of a giant bat or a manticore. Players can also learn cooking, fishing and first aid and it is fascinating how many different fish you can catch in the seas and lakes and the range of recipes available. Each food has different characteristics, usually to help boost health or mana (spell energy).

There are races that you can play, very much in the Tolkienesque genre: humans, dwarfs, gnomes, night elves, blood elves, orcs, trolls and undead. There are two races not from that kind of background: the Draenei, huge humanoid aliens from a different planet to Azeroth, where the game is set, and the ones I find most imaginative, the Taurens, large bovine humanoids very much in the style of Amerindian culture. Again the races have different strengths and weaknesses. One interesting element is the culture, as shown by their homes and their accents. As noted, the Tauren live in tepees in areas looking like the plains, the Rockies and the mesas of North America. The trolls are clearly influenced by Jamaican and other Caribbean cultures and they live in sunny tropical locations. Most comic are the goblins, who you cannot play, but who turn up as traders and engineers across Azeroth and all speak in New York taxi driver argot. Anyway, some of this draws on stereotypes, but it does make an effort to move a little away from the standard swords-and-sorcery elements. Many of the creatures are out of a range of Western mythologies, so there are centaurs, manticores and things like giant spiders and scorpions as well as simply ferocious wild animals like rheas, lions, wolves, bears and gorillas. Dinosaurs also survive, some as mounts for player characters. There are raptors and herbivores (some with the ability to fire lightning bolts, so very fantastical). Some creatures, such as the razormanes, who are humanoid boar-like people, seem unique to the game, though I do remember a similar race from the 'Runequest' RPG of the 1980s.

You can kill the monsters, beasts and people, but they 're-seed', meaning that within some set time they will come alive again to enable another player to complete their mission by killing them, or you to try again if you failed before. Sometimes the re-seeding happens very fast. I was looting the corpse of one opponent only to find him standing over his own body trying again to kill me. Even player characters never die entirely. You find your spirit at a nearby graveyard and either resurrect there or find your corpse and get back into it, with penalties in either case. Sometimes in a tough area you can resurrect only to be killed almost immediately again. Also some graveyards are far apart and you can spend an evening as a ghost running across the landscape constantly trying to get back to your body!

There are many NPCs (Non-Player Characters) in all sorts of forms. They act as traders and trainers in various skills and assign you missions. You do not have to interact with any human players to play in WoW as there are missions across the world appropriate for different levels. Different regions have monsters and creatures of different levels, so if you are starting out you do not face level 40 or even level 20 monsters until you are ready. Missions involve collecting artefacts, delivering messages and carrying out assassinations. Some of these revolve around the politics of Azeroth, which is divided into two main camps: Alliance (humans, gnomes, dwarfs, night elves, draenei) and Horde (undead, orcs, trolls, blood elves and taurens), with camps and bases across the continents and no clear frontline. There are also racial battles, such as between the tauren and the kolkars (centaurs). For me it is interesting to get involved in the politics of the place. There is also a nature vs. industry battle going on, with the Venture Mining Co. despoiling areas of the plains and especially the forests, and some missions are to try and stop them. In particular the taurens, with their Amerindian culture, shamans and druids, emphasise the environmental aspects.

One really winning element of the game is the landscape that your avatar can run or ride or drive around or fly over. You can adventure across every kind of setting, from frozen wastes to forests of European or North American style to the veldt or badlands or desert or tropical islands. They are well realised. You can stand on a mountain top in the Barrens (very like the veldt of Africa) and watch the sunset. It rains, there are misty days and so on in different locations. The cities are incredibly imaginative too; ones I have visited (I play as Horde characters, it is the Goth influence) include one in the ruins of a city, one in sandy caves and one atop mesas connected by rope bridges. Each has its own culture and is really beautifully rendered, with different districts for the various traders and trainers, where you can buy that vital sack to carry your loot or learn to smelt mithril or whatever you need. It costs money and there is the standard D&D currency of copper, silver and gold, though simplified to a 100:1 step between each.

In most regions there are also 'instances', which are like classic 'dungeons' from RPGs; these are for groups to attack and have numerous corridors and rooms to explore and there is, of course, treasure and artefacts to loot as well as experience to gain. Now, these are very like the kind of missions done with the paper-and-dice RPGs. You need a balanced team with fighters, magic users, healers, etc. and generally you need to collaborate if you are going to survive. You can join a guild, a kind of club, with its own communications channels. There are trade and general channels. Guilds have insignia and a tabard. Guilds vary: some consist of friends from the real world, some of people from a particular country (especially the Dutch and Scandinavians, who probably feel outnumbered by British players), some have a focus on fun and many have a focus on getting their members to rise through the levels very quickly. There is a technique called 'boosting' in which a very high-level character (the current highest level is 80) leads the way and simply slaughters everything that moves, leaving the lower-level characters to pick up the experience points and the treasure. I have participated in one of these, not knowing it was going to be like that (you often get invited to go on a mission especially if your character can 'tank', i.e. is a tough warrior, or can heal) and it was really tedious; I might have risen in standing, but simply by being the equivalent of a refuse man tidying up after the carnage. High-level characters can be good to help you out, but when there is no challenge there is no fun. However, for many players getting high-level characters is the prime goal. The fact that you can buy gold in the game for money in the real world, and software that allows you to complete missions easily, shows how far people are obsessed with 'levelling', i.e. raising up their characters.

Finally, having set the background, I come to my main point. Of course, where WoW goes beyond PC games is that, even with the richness of the various NPCs, you get to play with real people from across your country and other countries. However, as has often been noted, for all the fantasy names and the avatars, in the online environment people in fact reveal their real selves and that is in part what is alarming. If I had got into Second Life then I might have had personal confirmation of this fact sooner, but it has been in WoW that I have found it out myself. I think this fact first came to popular attention with the PC game 'Black & White' (2001), in which players played a god running various primitive settlements, and it soon became apparent that, however hard you tried to dissemble, how the people ended up would reflect your personality. I do not know if this kind of thing is used in scientific analysis but it does seem to work.

It is interesting how, even when packaged in a fantasy setting, the real you comes out. My girlfriend notes that when I play with her online I always step in to protect her or heal her character. I find she runs off in a random direction without telling me and I am uncertain what her intentions are. As in real life, her character will not be constrained. It is all via computer, but the behaviour mirrors who we are in reality. Interestingly, one of the greatest controversies in WoW was in January 2006, around guilds for gay players, and Blizzard had to drop its condemnation of such guilds.

In WoW you have to remember that whilst players are drawn from all ages and both genders, the bulk of participants are, as they were for the paper-and-dice RPGs of the 1980s, teenaged boys. There is nothing wrong with that. I would rather they played WoW than shoplift or take drugs. However, with them in predominance it tends to soil the collaborative nature that Blizzard, the producer of WoW, wants to foster. It is striking that when you log on you get a 'tip' about play; sometimes this is technical, but often Blizzard includes a homily such as 'a little kindness goes a long way' or 'if you talk to someone before trying to trade or invite them to join a group, they are more likely to do so'. To explain, 'trading' and 'inviting' are technical functions rather than dialogue. However, the fact that such tips turn up so commonly reflects the terse, demanding nature of many players. Reflect on the teenaged male players. They probably lack self-esteem in the real world, put down by society, their teachers and parents, and yet in WoW they can be a level 80 Death Knight called Ikillyouall and ride around on a dinosaur with a huge sword. They can show off their knowledge and show up older players. They can bully without ever facing consequences; they are literally immortal. No wonder they are obsessed with raising themselves to the highest level as quickly as possible and set up guilds, often with very strict rules, to achieve this. When you meet a lower-level character the player will often quickly tell you 'this may only be level 20, but I have four level 80 characters already' to show you that they must know more than you.

This sort of behaviour has always occurred in RPGs. I remember back in the early 1980s running a 'dungeon' in D&D for friends' low-level characters, but of course one boy with a level 4 character happened to have befriended a gold dragon, only the most powerful monster in that version of the game. His name was Jason Comfort, ironic because he was one of the most unpleasant people I have ever met. Last I heard he was with the RAF; I mention that just so I know I have the right one, as there seem to be scores of people out there with that name. He insisted on bringing the dragon to an adventure which was for far lower-level characters and it wiped out everything with ease. He loved lording it over the other players and demanded they behave in a certain way, even forcing one to sacrifice himself. So, this behaviour has always been around. I suppose what makes it worse is that, whereas in the past a boy like Jason would be one among a few, now they can band together through the wonders of the internet and that seems to make them feel that their behaviour is vindicated and thus 'right'.

Where tensions reach the highest level is in the 'instances', as the rewards are so much higher and there are often items that cannot be secured or bought anywhere else. When the fighting has stopped and the bodies are looted, special items are rolled for among the players through an automatic system. You can 'Pass' if it is an item you have no interest in, or you can select 'Need' or 'Greed', which has a bearing on the outcome. As you can imagine this causes immense tension. Of course the bulk of items could be traded later between characters (there are also auction houses in the cities, funnily enough very much like eBay but in a medieval setting), but for many players that is too late.

Many of the hard levellers take no professions, so do not understand that making a potion or a piece of armour often needs an exotic range of ingredients. I clicked 'Need' for a piece of moss agate, a semi-precious stone used in metalworking worth a few silver, and it was as if I had gone round these players' houses and insulted their mothers. I offered to pay anyone for any item they felt I had 'Need'ed wrongly. Something similar happened on another mission and I was accused of being a 'ninja' looter, because again I had put 'Need' for some armour. Being the only warrior in the party, it seemed not unusual that I should ask for armour or weapons that only my character could use. However, clearly that broke etiquette, more on that in a moment. Being called a 'ninja' looter means you are condemned by other players, who will not go on missions with you. I asked if they wanted me to leave the instance if they felt so strongly and they said not to. In fact I had 'Pass'ed on the bulk of the items and of course dared not touch any others, and I realised that had been the point. The players who moan the most and condemn others are the greediest, taking everything they can, even items they cannot use, simply to sell off later. In some cases a player who takes an item the party leader wants, sometimes by simply clicking the wrong key, finds themselves dumped out of the instance. I suppose it is unsurprising that when mixing with teenagers you find them squabbling like children.

A lot of the dialogue goes on in text speak. Players have to type to speak to each other, so this is no surprise. OK is reduced to 'k' and ready to 'r'; 'omfg' is popular to get around the system's built-in swearword detector. A lot of smilies are used, again no surprise. Capital letters seem unknown to many players and a lot of discussions can become a string of consonants. If you cannot keep up, or you use full sentences for clarity (often necessary when planning tactics), you are condemned. In fact there is very little tolerance of difference. Everyone is expected to know how these players see things and if they do not they are patronised. 'Noob', from newbie, i.e. a newcomer, is another insult, one hardly likely to endear the game to players who have not played for the past 3-4 years. There is similar intolerance for players who have a slow internet connection or have characters that do not hare around. As a warrior with a full suit of chainmail armour, of course my character moves more slowly than a rogue in leather or a mage in cloth robes. Yet you are expected to be constantly at the front. There is no thought of the differences and any reference to them is taken as an insult or as you somehow trying to trick the other players.

Of course, at the end of the day, it is only a game and the items are just electrons, though many players treat it almost as if it was reality. It is an environment of constant warfare, disease and brutality, like any pseudo-medieval setting, so I suppose in such a context you would expect selfishness. However, what is more worrying, aside from reducing the enjoyment of newcomers, is how behaviour in the game reflects so badly on the behaviour of these numerous players in real life. It is clear that greed is dominant and that it is seen as far more legitimate than need. Furthermore, anyone even questioning that greed, let alone contesting it, is seen as illegitimate and offensive for trying to curtail the person's taking of everything. There is no sense that there can be negotiation and trading, even though the system has such easy facilities for these things. They seem to entirely miss the point that collaboration actually helps you win through better than a lot of clashing egos.

Boastfulness, easy access to wealth and rewards without effort, intolerance of any difference and unwillingness to listen to explanation all seem to be the expected norm. Of course, they need my character, otherwise they would not bother trying to recruit me to help, but there is no sense at all of quid pro quo: I am to be their servant and if I question that, they eject me. Given the numerous tips about behaviour provided by Blizzard, it is clear they would prefer collaborative activity, partly because they know that if newcomers feel they are entering a hostile environment they will leave, as happens to people bullied in Second Life, and for the company that damages revenue.

Of course there are good people in WoW and if you look carefully you will find guilds that promote the fact that much of the fun comes from the participation rather than a hard-nosed drive to reach level 80 in a fortnight. Doing the latter means you miss out on the interest of a very complex fantasy world to explore and the interactions that are possible in what is nominally a game but is actually a very vibrant social network concealed beneath that. Yes, it is escapist, but that is nice. If you are downtrodden in the real world, it can be good to take out your frustration killing a giant scorpion. However, what I also think it reveals, if we did not already know it anyway from driving on the UK's roads, is that a lot of young men are terribly rude, greedy, hugely ambitious, intolerant and self-obsessed, and I am not keen to be an elderly person in a country where these men will be in charge very soon.