Before delving into this subject, it is important to highlight that an ecological and safety-oriented vision of urban movement has been growing for some time. Certain cities, such as Lille, Grenoble, Lorient and Rennes, limit traffic to 30 km/h. A desire to favour bicycles through the construction of cycle paths has also been noted, as has the development of exclusively pedestrianised streets. Even so, many feel that there is a lack of political courage in this domain, and they point to a form of generalised hypocrisy in urban public policy. It is also worth noting that traffic lights and low-speed driving ultimately lead to higher emissions of fine particles.
Traffic lights: a revolution in urban traffic flow
Traffic lights first saw the light of day in the USA during the First World War. They arrived in France in 1920 thanks to Léon Foenquinos, and began being installed in 1923. This invention replaced the “informal processes” (James C. Scott) that had regulated the relationship between urban spaces and their users. Previously, these informal processes were internalised in the public psyche and allowed users of urban space to take responsibility for themselves. The introduction of traffic lights as a method of preventing accidents created a subconscious fear of breaking the rules on the part of citizens. Thus, citing Scott once again, the “electronic legal order” henceforth became dominant in our society.
A critical look at the electronic order: the case of Drachten, Netherlands
Indeed, in the town of Drachten, a question was asked in 1999 which is fundamental to understanding the criticism of traffic lights: “What would happen if there were no electronic order at the intersection, leaving drivers and pedestrians to rely on their own judgment?” The consultation which followed demonstrated a desire, on the part of both public authorities and citizens, to remove traffic lights not only in the United States but also in some European countries, as David Millward notes in his 2006 article “Is this the end of the road for traffic lights?”. This way of rearranging urban circulation fostered self-responsibility within urban space and encouraged users’ independent judgment.
A less radical solution: Hans Monderman’s concept of “shared space”
Hans Monderman was a road traffic engineer whose idea was to do away with traffic lights. He conceptualised the notion of “shared space” after noticing that, whenever a power cut occurred, traffic flowed more smoothly than normal. He therefore ran an experiment, removing the traffic lights from the most heavily shared spaces – a roundabout, a cycle path and a pedestrian street in Drachten, through which around 22,000 cars normally pass daily. With the traffic lights withdrawn, the number of accidents over the following two years fell to just two, compared with thirty-six during the four preceding years. Traffic also moved more smoothly: drivers were more vigilant about giving way and rights of way, meaning they paid closer attention. It is also notable that traffic lights create frustration; people would often honk their horns the moment the light turned green. This frustration in turn led to aggression, anxiety and stress. The withdrawal of traffic lights thus produced fewer traffic jams as well as less aggression.
To solidify his theory, Monderman compared the situation to skaters who adapt their movements to other skaters. It may also be compared to users of skateparks, where a fluidity of movement and a considerate attitude between users exist automatically.
Thus, Monderman’s concept may be said to reflect an anarchic vision, as it depends upon individual intelligence, common sense and independence. Rigid rules within urban space, by contrast, tend to provoke behaviours such as sudden accelerations or running red lights. Thinking critically in this way allows us to take a step back and become aware of the individual within the urban network.
One could envisage the towns of the future adopting this model, or being still more radical and banning cars in town centres. However, it must be noted that this new philosophy of “what seems unsafe is most safe” remains subject to debate, but it is also praised by Dutch citizens who are in favour of a “verkeersbordvrij” (traffic-sign free) situation.
The origins of the legendary Nessiteras rhombopteryx (from the Greek for ‘Ness inhabitant with diamond-shaped fin’), so named by the naturalist Sir Peter Scott, hark back to the 6th century AD.
The famous monster first appeared in the year 565 AD in the vast Loch Ness, a freshwater lake in the Scottish Highlands.
According to legend, it was an Irish monk on an evangelising mission who saw the monster for the first time, and he managed to send it away simply by making the sign of the cross.
Clearly terrified by the monk, the monster did not reappear until 1,369 years later – an apparition immortalised by the London doctor Robert Wilson and then relayed in the English press by the Daily Mail. But it was an elaborate hoax – the monster in the doctor’s photo was nothing more than a toy…
In the decades which followed, despite Wilson’s confession and the discovery of the hoax, the rush of testimonies continued and didn’t show any sign of stopping!
The world believed it: the Loch Ness monster existed, the evidence was there.
We decided to investigate ourselves, on the ground, for Ap. D Connaissances…
Reaching depths of over 200 m, Loch Ness is intimidating. And very cold. A monster could easily hide in these murky waters.
When we arrived at the shore, at the beginning of our investigation, a strange coloured caravan attracted our attention; a sign underneath read ‘NessieHunter.com’.
Steve Feltham was its proud owner. Passionate about the monster since childhood, he had abandoned his routine and peaceful family life in 1991, in favour of heading for the shores of the loch.
He has not moved since. He observes and scrutinises the lake with powerful binoculars, waiting for ‘Nessie’. Unfortunately, we didn’t have the chance to approach him during our visit in July; he was too busy filming a documentary with a television crew in his caravan.
To make his living, he makes little figurines of the monster out of modelling clay and sells them in droves to the tourists who, each year, gather at the 56.4 km² lake.
All around in fact, we came across Bed & Breakfasts, souvenir shops, exhibitions… in the shops, Nessie is for sale as soft toys of all sizes, keyrings, mugs, hats, t-shirts…
The area around Loch Ness is, of course, very touristy. Boat tours even offer holiday-makers the opportunity to survey the lake with underwater cameras, for a few dozen pounds!
But none of this compromised the wild and magnificent setting, which made it easy to imagine some little-known monster in the middle of the calm waters.
Fulfilling our role as journalists-cum-investigators-cum-tourists, we made our way to the Loch Ness Exhibition Centre, in the unpronounceable town of Drumnadrochit, to continue our research.
The exhibition rooms analysed the myth of the Loch Ness monster with several films and animations, and the numerous testimonies were deconstructed. The rare images of this fabulous animal often have an explanation…
What if, in fact, the monster was nothing more than a piece of wood, or perhaps a seal? What if the waves it supposedly created at the surface were in fact caused by the movement of boats? And what if the mysterious bubbles that were sighted were just natural gas produced by decomposing vegetation?
It is important to note that many research projects were devised by scientists, to try to distinguish fact from fiction.
On five occasions between 1954 and 1972, surveys were carried out using sonar, a detection instrument which uses sound waves to track, localise and identify underwater objects.
Eight of the readings were deemed ‘positive’ – a strange echo had indeed come from the mysterious lake.
Upon leaving the exhibition centre, the natural history of the loch and its depths held no more secrets for us. We decided, dear reader, to conclude the investigation by affirming that the Loch Ness monster exists. She’s just shy.
So, is it a mysterious Jurassic plesiosaur, a giant snake, a huge sturgeon? It is a mystery of cryptozoology, and it is up to you whether you believe in Nessie or not.
And if this short report leaves you curious, take a trip to the Highlands yourself.
Because even without its famous monster, Scotland is home to many more riches, and is the home of several other legends…
‘Nessie has enabled me to live this glorious life and I am living proof that it is possible to follow your dream. […] It’s not the length of life that is important – it’s the width of life, the amount of adventures that you can squeeze into any given year. That’s what matters.’ Steve Feltham, in the magazine Scottish Field.
The term Big Society was coined by Steve Hilton, David Cameron’s director of strategy. It was one of the core policies of the 2010 Conservative manifesto, aiming to create an active civic community through a massive devolution of power from the centre to the local level. The three main objectives of the Big Society were more social action, public service reform and community empowerment. They framed Cameron’s response both to Thatcher’s laissez-faire and to Labour’s own failures on social policy. Citizens would be empowered to take responsibility for their own lives and were expected to help one another. The former Prime Minister identified the social evils he intended to thwart through solidarity and social progress: poverty, inequality and injustice. This article will first present the Big Society project before tackling the issues related to this audacious proposal, which led to its slow demise.
The Big Plan
Even if such altruism coming from the Conservatives seems startling at first, it turns out to echo a traditional Conservative tenet: civil society. The notion of moral duty had already been advocated by Edmund Burke (1729–1797) in his ‘little platoons’ rhetoric. To him, the family, the church and the local community are the foundations and stabilising forces of a healthy and strong society. Moreover, the concept is also affiliated with the One Nation Conservative ideology, which promotes the theory of an organic society revolving around local institutions (churches, libraries) that glue the whole nation together.
The Big Society was to be carried out through a series of actions led by the Office for Civil Society (formerly the Office for the Third Sector). These strove to reinforce civic responsibility and to foster an active sense of community endowed with a solid culture of volunteering and charity. Community groups would have taken charge of parks, post offices, libraries or even local transport services. David Cameron planned to set up a Community First programme to stimulate social action within small communities and give them a more engaged role in society. He also promoted the creation of a National Citizen Service (for sixteen- to seventeen-year-olds over the summer), built on a sense of moral obligation. The Big Society Bank, funded with dormant bank accounts, was meant to bring more support to social entrepreneurs, as well as to finance charities and voluntary groups. The mutualisation of public services was at the core of the project: public service reforms would have given charities, social enterprises and private companies the right to compete to deliver the best services. One of Cameron’s main objectives was to delegate welfare provision to local groups. Thus, individual responsibilities would be complemented by associations rather than by the state. The devolution of power to neighbourhoods would have given them a sense of ownership. The former Prime Minister insisted on the importance of the City Council’s role within communities. The social agenda included the setting up of the Community Organisers programme, designed to train 5,000 community workers. A reward points plan was also on the table, to encourage a stronger volunteering spirit among communities. In exchange for good deeds (litter-picking, holding tea parties for isolated pensioners), citizens would have been entitled to discounts redeemable in shops and restaurants. Some welfare rights (such as access to council housing) would also have become conditional on people demonstrating active citizenship.
This whole scheme echoes the Conservative ideology of making people bear responsibility (they should run local institutions themselves) instead of relying on the state. Hence, citizens would have been accountable for improving the services they ran, yet this also implied that they should be able to address future issues on their own.
Once David Cameron was in power, at the head of a coalition government (made up of the Conservative and Liberal Democrat parties), some initiatives were launched in accordance with the tenets of his Big Society. First, the Localism Act was passed in 2011. The act facilitated the decentralisation of decision-making powers from central government to individuals and communities, giving them the opportunity to initiate more action. Local authorities were also given greater autonomy. In addition, the act introduced planning powers at the local and neighbourhood levels: Neighbourhood Development Plans (NDPs), Neighbourhood Development Orders (NDOs) and, as one specific form of NDO, the Community Right to Build Order, through which residents could influence housing development. Then the Open Public Services white paper was published in 2011. It drew upon the idea that users of public services could legally choose their provider, thereby expanding the range of choice they were offered. David Cameron meant to encourage the development of the third sector, in which private ‘social’ companies and charities could compete with the public sector to offer their services to users. Under the coalition agreement, the setting up of free schools run by parents, teachers, charities and education experts in 2010 was a meaningful shift towards Cameron’s model. Nonetheless, the free schools’ autonomy was limited, since they remained state-funded.
The Big Society was part of an ambitious modernisation agenda. Cameron planned to move away from the ‘nasty party’ image inherited from Margaret Thatcher. The Big Society would, by definition, go against her view that “there is no such thing as society”, a firm stance she famously shared in an interview with Woman’s Own in 1987. In opposition to this, Cameron sharply declared, “there is such a thing as society, it’s just not the same thing as the state”, in a speech as Conservative party leader in December 2005. By purposefully shedding this burdensome legacy, Cameron hoped to appeal to more voters in the name of a return to compassionate Conservatism. The former Prime Minister intended to use social reform to put his stamp on the Conservative party and to be “as radical in social reform as Mrs Thatcher was in economic reform […] to mend the broken society”, as he claimed in 2008. The ‘Broken Britain’ he referred to describes a tax and benefits system that entrenched dependency and worklessness. Cameron castigated the Blair–Brown years, holding them responsible for the financial debt and for an overly powerful state that destroyed personal responsibility. To encourage a trust-based relationship between the people and this fresh, progressive and earnest Conservative party, the former Prime Minister committed himself (and his party) to regularly publishing government data. Such accountability and transparency would reassure people and let them feel they could always have a say in politics.
Despite all his efforts and dedication, Cameron’s initiative remained unsuccessful and was never fully implemented during his premiership. He had to admit failure and give up on his project for several reasons: some that he stubbornly seemed to turn a blind eye to (a lack of enthusiasm from citizens), and some that befell him through poor strategy (economic hardship demanding drastic austerity measures).
The Big Mistake
The decline of the Big Society was due to several factors, both internal and external, that crushed the former Prime Minister’s hopes. One could criticise the brazen paternalism of confidently assuming what is fundamentally better for citizens, even though they were not particularly keen to follow a lead which lacked clarity. It was rather bold of Cameron to persist in trying to convince the electorate that his proposal would actually benefit them all. The Big Society did not work because individuals were not willing to take a more active part in civic communities. According to a 2009 Ipsos MORI poll, only one in twenty members of the public wanted “involvement” in the provision of local services, whereas one in four merely wanted “more of a say” and half just wanted “more information”. Had the Prime Minister overlooked what people actually demanded and over-estimated their political participation? He might have advocated too passionately for his idea, which raises suspicion about his genuine motives. Would his proposal ultimately serve only his (and his party’s) interests?
Equally, the concept may have failed because it would have entrenched inequalities instead of closing the gap. When asked who would be most likely to take up more responsibility in a neighbourhood applying Big Society principles, the most politically involved individuals were often the better off. The citizens most willing to volunteer in their free time are either those who are already well educated or skilled, those belonging to the majority ethnic group, or the retired. Participation (civic or voluntary work) is thus determined by socio-economic factors and social capital. The risk that minorities’ interests would not have been equally represented and defended in decision-making was remarkably high. What, then, were the solutions designed to fight social exclusion at the local level?
The Big Society was also massively criticised for its lack of coherence: how could local communities be empowered to run local institutions and take a more active part in politics if their funds were being cut? It was quite contradictory to push for more voluntary and charitable action while the coalition government carried out its austerity measures. It enforced drastic public spending cuts in the third sector to cope with the economic hardship the country was facing. Yet many charities and even social enterprises relied on grants and contracts from local government. Consequently, this shift in third-sector funding highlighted the tremendous role the state played in supporting local authorities, the public sector and charities, and it would have been onerous and strenuous to compensate for the financial aid the state usually provided them. For instance, a loss of between £3.2 and £5.1 billion in public funding for charities had been estimated. One major example of the poor long-run viability of the project was the withdrawal of Liverpool City Council from the Big Society vanguard project in 2011, a U-turn the council justified by the £141 million of cuts to its budget. At first, Cameron was so keen to promote his scheme to local people, organisations and social entrepreneurs that he launched a town-hall meeting programme across the country. However, the very first meeting turned out to be a failure and the other meetings were immediately cancelled: the audience angrily protested over cuts in the voluntary and public sectors. The growing indignation and resentment could no longer be ignored. Consequently, some of Cameron’s critics claimed that he only upheld the Big Society to make up for the massive public spending cuts he intended to pursue once in government.
But this point is up for debate, since David Cameron first introduced his idea in 2008, before the financial recession seriously hit and called for dire austerity measures. Furthermore, the former Prime Minister directly addressed those critics in his party conference speech in Birmingham in 2010: “The Big Society is not about creating cover for cuts”; “I was going on about it years before the cuts.”
Another reason for the failure of the Big Society could be the side effects of the marketisation of public services it would have entailed. How could charities possibly compete with the private sector to offer the best services to users? Did they need to adopt a more commercial, business-driven strategy? It would have been rather hard to keep up the pace when their funds were being severely cut. Was there an underlying desire for privatisation behind these seemingly socially cohesive economic policies? The creed “Big Society, not Big Government” explicitly implies that there would be less state regulation. The motto illustrates the Conservative and Liberal Democrat principle of securing a deregulated free market with less state interference, in which individual freedom prevails over a strong government.
The proposal was eventually buried because of the inconsistencies it raised. For instance, to what extent would communities be empowered with autonomy? The prospect of local groups challenging the power of banks or campaigning against government policies was undesirable, and would go against the vision of a government working hand in hand with local communities. How would the government cope with citizens’ desires and initiatives when they clashed with its own? Another paradox lies in politics itself. How could a system claim to favour fairness if it deliberately turned a blind eye to its macroeconomic causes? The same goes for its promise of direct democracy, which appeared very appealing but turned out to lack serious constitutional reform. Mutualism and co-operatives were praised and encouraged, but market transactions remained dominant. Moreover, a compassionate Conservatism was advertised in speeches, prioritising the fight against poverty, inequality and injustice, yet the solutions remained fundamentally neo-liberal. In reality, Cameron mainly sought to shrink the state, both to maximise the mechanisms of a market economy and to enhance individual freedom and responsibility.
Did the project ultimately fail because of the period (was it simply not viable for the Britain of 2010–2015?), or because of the massive lack of support (from both politicians and citizens) and of financial resources (austerity measures cut the third-sector budget, jeopardising any successful autonomy from the state)? Or was it doomed from the beginning by its inconsistency and by individuals’ distrust – how legitimate and genuine can an alternative be if it comes from the establishment itself? Voters wondered whether the plan was honest in its goal of improving citizens’ living conditions, or whether it was just another smokescreen to fool the electorate and convince them to vote Conservative. People may also have been deterred because the plan was not radical enough; it appeared as nothing more than buzzwords, built on rusty old recycled ideas from the establishment.
5th July 1962: Algeria rejoices – independence is proclaimed. After eight years of war and between 300,000 and 600,000 deaths, Algeria regained its freedom. But the independence process had begun a little earlier: the Évian Accords, signed on 18th March 1962, established a cease-fire (which came into effect in Algeria on 19th March). The Accords were approved by 90.7% of voters on 8th April, in a referendum in metropolitan France, and by 99.72% of voters on 1st July in the Algerian independence referendum. Algerian independence was thus overwhelmingly accepted on both sides. But while the Algerian crowds delighted in crying “Long live independent Algeria”, and while everything pointed towards a common happiness, a new issue arose: creating an Algerian state.
A new war, new opposition
And this was no mean feat, because in reality the conflict carried on. Two opposing sides vied for power in the newly independent Algeria: the Provisional Government of the Algerian Republic, and Ahmed Ben Bella, backed by Houari Boumédiène’s People’s National Armed Forces. The provisional government, created during the war in 1958, had the more official and legitimate status; having organised and participated in the Évian Accords in the name of the Algerian people, it had become the official voice of Algerian independence on the diplomatic stage. Faced with this provisional government, Ahmed Ben Bella decided, on 22nd July, to create a political bureau in Tlemcen. Support for Ben Bella came from the Oujda Group, a faction that had emerged from the National Liberation Front. Ben Bella then allied himself with the People’s National Armed Forces (which gave him direct influence over the National Liberation Front) and took on revolutionary credentials, becoming in some ways the (indirect) representative of the National Liberation Front and a symbol of independence and revolution. Confrontations and political uncertainty followed, during which Algeria had no stable political power – this was the crisis of the summer of 1962. Beyond the military opposition, the two sides were ideological opposites: the provisional government wanted to impose a conservative political order, far from Ahmed Ben Bella’s revolutionary socialist ideal.
This crisis came to a conclusion on 9th September with the entry of the People’s National Armed Forces into the capital, Algiers, led by Boumédiène and encouraged by Ben Bella. The National Liberation Front took advantage of this by organising constituent assembly elections on 20th September; as only candidates from the National Liberation Front were put forward, they won all the seats. On the same day, a referendum was organised to legitimise the role of the Constituent Assembly, and the “yes” vote won with 99%. The People’s Democratic Republic of Algeria was proclaimed on 25th September. Two days later, Ahmed Ben Bella was invested as head of government with the support of 88% of the Constituent Assembly, before becoming the first head of an independent Algerian state.
New state, new constitution
A little over a year after independence was proclaimed, on 28th August 1963, the Algerian Constituent Assembly adopted a new constitution proposed personally by Ahmed Ben Bella. The proposal was met with disagreement and misunderstanding within the National Liberation Front, provoking a schism in the party. But this did not prevent the Assembly from approving the new constitution by 139 votes to 23. The Algerian people also largely approved of the new constitution: the referendum of 8th September 1963 – almost a year after the referendum on the Constituent Assembly – saw it approved by almost 98% of Algerian voters. On 15th September 1963, Ahmed Ben Bella became President of the People’s Democratic Republic of Algeria with 99% of the vote, in an election in which he was the only candidate. This was the beginning of a single-party regime, led by the National Liberation Front, in which political opposition was forbidden. It is for this reason that parties like the Socialist Forces Front and the Party of the Socialist Revolution could exist only underground. The last elections of the Ben Bella era were the legislative elections of 20th September 1964 in which, as in 1962, only candidates from his party could stand; they therefore won all the seats.
A failing democracy
In a little more than two years, Algeria had held three referendums and three elections. Often considered expressions of democracy in themselves, the referendums and elections here demonstrated a worrying ambivalence. Indeed, Ahmed Ben Bella’s regime alternated between elections and referendums on the one hand and authoritarian tendencies on the other, because (with the exception of the independence referendum of 1st July 1962) most of the votes seemed rigged, given their landslide results. Even though the “yes” vote in the independence referendum reached 99.72% (a very high result), this was not very surprising given the revolutionary sentiment in Algeria, the fact that in metropolitan France the “yes” vote represented 90.7% of ballots (another very high result), and the fact that the referendum had been organised and approved by France. After this, however, every election was held in the absence of any opposition to the National Liberation Front, and produced results well above 90%.
An ideological reconciliation
However, we must not forget that this period of decolonisation took place in the middle of the Cold War. The not only revolutionary but socialist character of the Algerian independence movement was therefore not incidental. Most colonies belonged to European countries; in Africa, they were largely French and British – and France and Britain carried great weight in the Western Bloc. The colonies were thus, in themselves, enemies of the USSR, and it is for this reason that the Soviet Union encouraged desires for independence by advocating anti-capitalism and anti-imperialism. Many struggles and negotiations were thus led by communist or socialist parties which, on coming to power, would establish a socialist regime close to the Soviet model. This was the case in Algeria, but also in Southeast Asia, Angola and the Democratic Republic of the Congo. The USSR thereby scored a double victory: destroying Western influence in Africa and Asia while expanding its own influence, and that of its ideology, in these regions.
The USSR extended its influence in Algeria notably through the People’s National Armed Forces and Colonel Boumédiène, supporting them materially and economically. The People’s National Armed Forces thus became a considerable military force – and a danger. Let us not forget that Boumédiène was the man who allowed Ben Bella to come to power, having been able to oppose the provisional government militarily during the crisis of summer 1962. He now wished to take the President of the Republic’s place himself. Ahmed Ben Bella, sensing the approaching danger, decided to dismiss the ministers in charge of the most important ministries (Interior, Education, Information). These ministers belonged to the Oujda Group, led by none other than Houari Boumédiène. On 19th June 1965, Boumédiène organised a coup d’état against his former ally, overthrew him, and became the new de facto leader of Algeria.
Thus, Algerian independence came about following a bloody war, but also in the wake of a crisis and internal conflicts. The reconstruction of the country in the years that followed was carried out amid political difficulty and instability. In reality, the semblance of democracy demonstrated how hard it is to establish one while constructing a state at the same time. This democracy became a failure from the moment it became illusory: independence had been voted for by the people, but the subsequent reconstruction was in fact undertaken without their real input. Therein lies the difficulty of rebuilding a country while constructing a state.
During the era characterised by the rivalry between the USSR and the US, soft power had an important place in the conflict. Ideological combat was just as essential as military operations – something Stalin’s government, and Khrushchev’s especially, understood well. Thus, football experienced a rise in popularity in the USSR during the 1950s and 1960s, thanks to the efforts of sporting heroes like Lev Yashin, Igor Netto and Valentin Ivanov. A journey back in time allows us to retrace this epic story.
Countless penalties were stopped by the legendary goalkeeper Lev Yashin, still to this day the only goalkeeper to win the prestigious Ballon d’Or, in 1963. He was awarded the title of Hero of Socialist Labour in 1990, testament to his influence.
Sport as an emblem of the communist model
As with the industrial sector, it was for the Soviet Union a question of equalling or even surpassing capitalist sporting powers, as sporting success could be linked to the means employed by nations to train their sporting elite to stand out at international level. For example, the training process for a high-level athlete proved to be particularly costly in terms of infrastructure, research framework, etc.
Additionally, sport had a large audience which needed to be won over. This reasoning applies even more to football, given its long-held position as the most popular sport in the world.
Press articles and biographies allow us to see this process of framing athletes as heroes. Sylvain Dufraisse, postdoctoral candidate at the University of Nantes, evokes the ‘internationalisation of sport.’ This term demonstrates an intention to push athletes to the top level, maximising their potential through modernised training methods and an increased consideration of athletes’ physical and mental wellbeing. To encourage the athletes to push their limits, several reward systems were established alongside these methods, including the title of Master of Sport.
This paradigm signals a major turning point in Soviet thinking. Indeed, competition had thus far been seen as a bourgeois construction which advocated individualism. However, the Cold War marked a break with this attitude and promoted a perspective of confrontation with other countries.
1956-1958: the first promising steps onto the international stage
The Association Football Tournament at the 1956 Olympic Games in Melbourne was an important outing for the Soviet football team. Although the tournament was at the time reserved for amateurs, it paved the way for the beginnings of the professionalisation of football. The USSR nevertheless travelled to Australian soil without particular expectations, preferring to concentrate on the upcoming World Cup.
However, the team racked up victories, finally facing Yugoslavia in the final (whose President Tito had strained relations with Stalin until Stalin’s death in 1953). This 100% communist final saw the USSR come out victorious, by the smallest of margins with a score of 1-0. The team of Anatoli Ilyin, Lev Yashin and Igor Netto walked away with gold medals around their necks, and fuelled the hopes of an entire nation for the World Cup…
However, before they could hope for a successful run, Gavriil Katchaline’s team would have to overcome the obstacle of the Qualifiers.
With one game to go, they led their qualifying group, ahead of the Polish team. However, driven on by more than 100,000 supporters, Poland defeated the USSR, levelling the two teams on points and forcing a deciding play-off. After a suspenseful match, the USSR won the play-off 2-0 and secured their ticket to the 1958 World Cup.
The USSR managed to reach the quarterfinals despite a rollercoaster of matches, including a victory against Austria, a defeat against the untouchable Brazilian team, and a draw against England followed by a win in a play-off match.
The host country Sweden then stood in their way. Before the match, they were already faced with some disadvantages – they had only had one day of rest between matches, while the Swedes had had four! Additionally, they learned that their flight to Stockholm had been cancelled, which led to a long bus journey which impacted their performance greatly. Sweden, motivated by a huge crowd, won 2-0. The USSR went home and was battered by the local press, in spite of an encouraging result for their first time in the tournament.
This painful experience allowed the team to gain experience and understanding. It remained to be seen whether they would have the mental capacity to perform at the European Nations’ Cup in 1960.
1960: The USSR at the top of Europe
In this year, the Soviet team wrote the finest chapter of their history. However, not everything was in good shape going into the European Nations' Cup in France. The team found themselves deprived of one of their star players, Eduard Streltsov, who had been sent to a labour camp after being accused of rape; although the evidence was inconclusive, he pleaded guilty to the accusation. Losing one of the team's most talented players took a toll on everyone's morale. Nonetheless, the will to keep progressing ran deep in the team's psyche and kept the players focused. In addition, manager Gavriil Kachalin was able to refresh the team while simultaneously preserving its foundations.
The qualifiers were hardly more than a formality; the USSR played only two matches. The first was against Hungary, who had lost their star player Ferenc Puskas to exile in Spain. The Soviet team played well and won comfortably 3-1. The task was more difficult in the second leg, but the result was the same: victory for the Soviet team.
All that remained was to overcome the obstacle of the Spanish team; however, Franco categorically refused to send players from his country to the USSR, and the Soviet team thus qualified by forfeit. Things were starting to get serious. This time, the desire to win was clearly visible; it was no longer a question of hiding behind an 'outsider' status, but of aiming for a place on the podium for the first time.
The European Nations' Cup, the first European Championship, took place in France as a tribute to Henri Delaunay, the creator of the competition, who had died shortly before it. The choice of location was far from unanimous, and took on political dimensions when West Germany, England and Italy decided to boycott the event. Thus, the level of play was lower, and only 17 teams took part, increasing the probability of surprises.
The 17 teams nevertheless took up their positions. It was in a final four composed of France, the USSR, Czechoslovakia and Yugoslavia that Lev Yashin and his teammates got the measure of what they were playing for.
The USSR gave quite a performance against Czechoslovakia: a crushing 3-0 victory and an opportunity to progress even further to play the Yugoslavs, for a remake of the 1956 Olympic final.
It was at the Parc des Princes, watched by 18,000 fans, that the last part of this tale took place.
The tension was palpable; the two teams knew each other well and were apprehensive. This nervousness showed itself on the field, with many instances of contact breaking up the game. Yugoslavia opened the scoring just before half time, a blow to the morale of their opponents. However, Lev Yashin's team responded after the break, equalising through Metreveli in the 49th minute. The match finished 1-1 after ninety minutes and went to extra time. The fate of the match hung in the balance, and the prospect of a draw even after extra play only added to the tension of this legendary encounter.
It wasn't over yet, however. Seven minutes before the end of extra time, Viktor Ponedelnik scuppered Yugoslav hopes and scored the winning goal. The only player in the side to come from the Russian second division, he catapulted the team to the status of national heroes.
The heroes returned almost immediately to Moscow to be with the people eager to celebrate their achievement. The win had a huge resonance, evidenced by the crowd which gathered at the Central Lenin Stadium.
Several more campaigns followed their victory, but these were marked by early World Cup eliminations or failures to qualify for the European Championships. This was proof that their peak was only ephemeral, but it earned the team a place in the annals of football and of Soviet sport.
Lev Yashin, a legendary goalkeeper ahead of his time
Lev Yashin had extraordinary abilities, and his name will always be among those of the greatest players of all time. He was named Best Goalkeeper of the Century both in Europe and globally by the IFFHS. The only keeper ever to win the Ballon d'Or, in 1963, and nicknamed the 'black spider' because of his agility and black football attire, Yashin posed a problem for any attacking player. His figures speak for themselves: specialists have recorded over 270 matches without conceding a goal, as well as 150 penalties saved. These almost unreal statistics seem even more impressive when you consider the playing philosophy of the time; teams put all the pressure on the attackers and played with only one or two defenders.
However, his career was worth more than a collection of individual accomplishments; the Soviet keeper radically changed the style of play of goalkeepers during the 1950s and 1960s. Rather than clinging to the goal-line while passively waiting for opposing offensives, Yashin did not hesitate to come several metres off his line to block the angle of attack (not unlike Manuel Neuer today). He was also one of the first goalkeepers to punch away balls that would be difficult to catch. This revolution in goalkeeping style revealed a man ready to go against convention, with a charisma that would be the envy of many Hollywood actors.
Yashin was not entirely like the others, though; for example, he declared that his pre-match ritual was to smoke cigarettes – 'to relax the muscles' – and drink a glass of vodka – 'to invigorate them.' The communist regime quickly understood that censoring Yashin was pointless, and that it would be better to use him as an example of Soviet prestige. In fact, his difficult childhood in a working-class area of Moscow made him credible and relatable for the crowds. Born to a working family, he became an apprentice locksmith as a teenager during the war. At the age of sixteen, in 1945, he was rewarded with a medal for 'his valiant work during the war.' The USSR seemed to have found an almost perfect hero.
Like any hero, he used his charisma to unite his teammates around him. Even when things were going badly for the team, his decisive performances continued to be a motivating factor, even after a head injury at the 1962 World Cup in Chile.
From the age of fifty, Yashin's health began to worsen, and he contracted gangrene in his leg. This painful experience marked the beginning of a long decline in his health, which included an amputation. He died of stomach cancer on 20th March 1990.
Thus, this monument of football and sport will have inspired millions through his achievements and his singularity. Hats off to him – heroes and legends are timeless and eternal.
The streaming platform Netflix has established itself as an unavoidable giant in the world of cinema, and has a strategy to counter the arrival of Hollywood studios in the increasingly contested video streaming market.
Netflix has played its cards right in the world of the Seventh Art. The world number one streaming platform invested 175 million dollars in The Irishman, the most recent film by Martin Scorsese. By way of comparison, in 2015, the blockbuster Jurassic World had cost Universal Pictures 150 million dollars. By choosing three heavyweights of the American box office – Robert De Niro, Al Pacino and Joe Pesci – Netflix confirms, with no ambiguity, its desire to compete with the large American film production studios, which had otherwise been immovable in the industry.
The platform has already competed with some big names. During the 91st Oscars ceremony, it pocketed three awards for Alfonso Cuaron’s art film Roma. The arrival of Netflix has irreversibly altered the traditional economic model of cinema, as the platform is able to sustain skyrocketing production budgets, thanks to recurring revenue from its 150 million subscribers.
However, the streaming site struggles to adapt to the European model, which is characterised by public financial support for creation of cinema.
“The goal of Netflix is to make money, not art. As a production house, it signals the death of local economy. The revenue from a film is no longer shared between producers and distributors,” comments Raluca Calin, a sociologist specialising in European cinema.
In the Hollywood market especially, Netflix's strategy remains a nuanced one. It has not yet matched the lavish budgets of the biggest Hollywood films; in 2018, Avengers: Infinity War was estimated to have cost between 320 and 400 million dollars.
Attracting the great directors
Netflix owes much of its success to its original series such as Black Mirror, Stranger Things and La Casa de Papel. Between 2016 and 2020, its budget for these productions tripled, rising from 5 billion to 15 billion dollars. Rivals Disney, Warner Media and NBC Universal are determined to make their own impact, having broken into the video streaming market in recent weeks. Netflix, for its part, wants to play Hollywood at its own game by producing blockbuster films; it has already attracted major directors like Martin Scorsese, Steven Soderbergh and Guillermo del Toro.
This is a vital move for the platform which will soon lose its rights of exploitation for some of the series which engendered its popularity. According to a study by data analysts 7 Park Data, 80% of programmes streamed by Netflix are not their own programmes. These shows have been produced by its competitors, and there is the continued threat of their withdrawal from the platform. This looming reduction of its catalogue is shown most notably in the departure of cult series such as Friends and The Office, which will return to NBC Universal in 2021, and Grey’s Anatomy, which belongs to Disney.
However, nature abhors a vacuum: Netflix will compensate for this loss of content with an upcoming blockbuster. Six Underground, directed by Michael Bay and allocated 150 million dollars of Netflix's budget, will be online from 13th December.
Profoundly affected by Covid-19, the UK has struggled to provide an efficient response to curb the spread of the virus. The massive financial plan presented by the government relies heavily on welfare state provision – supporting the NHS and introducing benefits for employees and small businesses. However, the crisis has not endangered the welfare state system; it has only exposed its deep-rooted flaws. Hence, I seized the opportunity given by the current context to go back in time and discuss the evolution of the welfare state through the 20th and early 21st centuries, to better grasp its origins and its challenges.
The welfare system
The UK welfare state is a 'safety net' which aims to provide support 'from the cradle to the grave.' The three major themes of the welfare state revolve around education, housing and healthcare. Its provision to millions of British citizens is carried out through a set of programmes and grant schemes. Among them are benefits for working parents, the unemployed and disabled people, housing benefits (to help pay rent or afford heating expenses) and the basic national state pension. The welfare state was initially designed to protect citizens and ensure a greater level of equality among them, yet some argue that the system is too costly for the government and entails higher tax rates. They denounce welfare dependency, whereby benefit recipients may have no incentive to find a job, which worsens the already high unemployment rate and low rates of productivity. These critics advocate an overhaul of the system. In fact, the UK welfare state initially tended to follow the Beveridge model rather than the Bismarck one. The former is characterised by flat-rate contributions and tax payments to finance public health insurance; hence, the state budget plays a major role in its funding. The Bismarck model is based on an insurance system financed by both employer and employee contributions, based on salaries. However, a gradual convergence of the two systems of social security can be observed in Europe.
The emergence of social security
After the Napoleonic wars, the country faced economic hardship and a high unemployment rate, and levels of poverty soared. Hence, provision of relief to the less fortunate came in the form of the 1834 Poor Law Amendment Act, which reformed the workhouses. Workhouses usually provided food and shelter for those in desperate need, in exchange for tedious, repetitive and sometimes dangerous work. Oliver Twist, although fictional, provides an accurate account of those unforgiving institutions. Nonetheless, their role was more extensive, as they could also function as schools, hospitals and asylums. The 1834 Act strengthened the principle of 'less eligibility', which ensured that 'the condition of a pauper in the workhouse should be not as attractive as that of the poorest labourer outside the workhouse.' In fact, the Act was designed to deter the able-bodied poor (who were socially regarded as responsible for their own situation due to their idleness) from depending on the workhouses, and rather to encourage them to seek a job instead. A distinction was established between the deserving poor – who deserved help, pity and compassion – and the undeserving poor, perceived as mere burdens on society. Besides the paternalistic and degrading workhouses, which stripped their inmates of their individual freedom and autonomy, the end of the 19th century was characterised by the emergence of Friendly Societies and Trade Unions. These organisations, alongside private and volunteer charities, were an alternative source of support for the unemployed and the poor. In short, the late 19th century was a mixed welfare economy.
The foundations of social security welfare were laid in the early 20th century. A series of reports (conducted by Seebohm Rowntree in 1901 and Charles Booth in 1892-1897) exposed the plight of poverty and the increasing number of working-class people falling below the poverty line. As a consequence, the question of poverty became a pressing electoral issue. The Liberal Party, in government from 1906 to 1914, launched a series of laws to tackle the problem by targeting the three main vulnerable groups: children, the elderly and workers. Free school meals were first established in 1906, and the Children's Charter of 1908 condemned cruelty and neglect. The pension scheme for those aged over 70 was passed in 1908. In 1911, the government drafted the momentous National Insurance Act, a contributory system providing workers with health insurance (echoing the Bismarck model). However, WWI led to a hiatus in the desire to reform.
The influence of the Beveridge Report
Published in December 1942, the Beveridge Report, or the 'Report on Social Insurance & Allied Services,' identified the 'five giant evils' of society which William Beveridge aspired to eradicate: Want, Disease, Squalor, Ignorance and Idleness. Hence, he advocated a national, compulsory, flat-rate, comprehensive and universal insurance scheme covering healthcare, unemployment and retirement benefits. In fact, the Liberal politician denounced means-tested benefits, which led to the stigmatisation of benefit recipients. Furthermore, he proposed making children's education free, developing council housing and striving to achieve full employment. Following the publication of his decisive report, major progressive legislative measures were enacted. The first, the 1944 Education Act, made secondary education free for all and created the notion of selection at 11. Then, the National Insurance Act of 1946 established a comprehensive social security system to grant benefits 'from the cradle to the grave.' Two years later, the National Assistance Act extended the provision of benefits to those not covered by the previous Act. A second Act of 1946, the National Insurance (Industrial Injuries) Act, covered industrial injuries and introduced paid compensation for injured or disabled workers. Last but not least, the National Health Service Act of 1946 established the NHS, although it was only officially implemented in 1948, owing to firm reluctance from certain doctors.
William Beveridge (1879-1963), Economist and Liberal politician.
The welfare state during the post-war and Thatcher years
In order to rebuild the country after WWII, the Labour government of 1945-51 built 1.25 million council houses. They strove to replace old slum housing and to provide decent homes to those who had lost their houses to the bombs. The Town & Country Planning Act of 1947 set a target of building 300,000 new homes per year. In 1951, due to the increasing cost of medical treatments and services and the growing number of patients, the NHS resorted to introducing charges for dentures and glasses. The free, comprehensive and universal ideal of Beveridge could not keep up with expenses outweighing the allocated state budget (around 10% of GDP). Beveridge had also assumed that the country could maintain full employment; consequently, when the UK faced harsh economic hardship in the 1970s with more than one million unemployed, the amount of benefits they could claim became a serious concern for the state budget. Stagflation (high inflation combined with economic stagnation) jeopardised the provision of the welfare state.
When Margaret Thatcher became Prime Minister in 1979, she pledged to roll back state intervention and, in accordance with her neo-liberal stance, develop privatisation and free market policies. Furthermore, she made unemployment and sickness benefits liable to tax. In short, Thatcher promoted the Victorian values of self-reliance and a degree of noblesse oblige, instead of entrenching the dependency or ‘nanny-state’ culture.
The Quasi-Marketisation of the Welfare
A set of market-based reforms was launched in the 1980s. The first overhauled the structures and agencies of the civil service. New strategies and models, usually applied in businesses, were implemented in the public sector, resulting in the rise of various agencies such as the NHS Trusts. Public services were no longer reserved to the public domain: the private sector could now run hospitals, propose insurance schemes and offer other alternative services. However, these options are mainly reserved for those who can afford them, and they thus widen the social class gap. The plan was meant to foster competition and ultimately improve the efficiency of the services delivered. This marketisation was carried out under the label of New Public Management. In his book Why We Need Welfare, the social policy specialist and professor Pete Alcock is overtly sceptical of these policies. He exposes the consequences of managerialism, which conferred more responsibility on managers striving to reach targets instead of focusing on professionals' opinions. To Alcock, the neo-liberal principle of granting individuals a greater range of choice, labelled the provider culture, tampers with the provision of a caring and collective welfare state – an ideal he heartily endorsed.
Another considerable shift can be observed in public attitudes towards welfare recipients. The compassion and empathy they had once inspired, following the publication of the series of reports which were pivotal milestones in the construction of the welfare state, gave way to scorn, animosity and individualism. According to surveys, an increasing number of participants tend to hold recipients responsible for their own poor situation in 'meritocratic' Britain, a view which overlooks the by-products of socio-economic inequalities and undermines the concept of social determinism. Furthermore, some even deem them burdens on the state, living off welfare benefit schemes and draining collective efforts. However, it is important to highlight the underlying influence of the media hammering on about the cost of benefit fraud, which ultimately contributes to the rise of acrimony.
The quasi-marketisation was also enforced in the education sector. The Office for Standards in Education (Ofsted) was created in 1992 to inspect schools and to publish ranking reports. Therefore, they gradually encouraged a degree of competition between institutions to guarantee the best academic results and educational content, in order to attract more pupils.
The welfare state today
Contentious reforms were enforced under the coalition government (composed of the Conservatives and the Liberal Democrats) of 2010-2015, altering and reshaping welfare state provision in order to cut expenditure. Such revisions were carried out in accordance with their austerity agenda, aimed at recovering from the 2008 financial crisis. They adopted a targeted approach to welfare provision which restricted the Child Benefit conditions in 2013, set a benefit cap and implemented the Bedroom Tax (or Reduction in the Spare Room Subsidy) in 2012. By tightening means-tested provision, the stigma attached to benefit recipients worsened (accentuating the Us vs Them myth) and the rate of non-take-up of benefits also rose. This demonstrates a clear departure from the Beveridge model, which abhorred means-tests. Universal Credit was established in 2012 to replace six means-tested benefits and tax credits (Income-based Jobseeker's Allowance, Housing Benefit, Working Tax Credit, Child Tax Credit, Income-based Employment & Support Allowance and Income Support), combining them into one monthly payment.
The government pursued its austerity agenda through the controversial rise in university tuition fees, whose cap tripled to reach £9,000 a year (and rose again in 2017 to £9,250). This contributed to the marketisation of higher education, in which universities were expected to compete by setting different tuition fee rates; the pressure to appeal to students was supposed to lead to massive enhancements in the quality, price and relevance of universities' provision. However, this aim failed, owing to the choice of universities to set their tuition fees at the cap price: they dreaded that fixing a lower price would suggest they were of second-class quality. The marketisation process was further illustrated in 2014 with the implementation of the Research Excellence Framework (REF), a method which assessed the quality of research in UK universities in order to inform and provide a transparent account to research funders.
The welfare state is a crucial safety net for millions of British citizens, and demographic projections all warn of an ageing population. This ultimately implies that an increasing number of citizens will become a strain on tax revenues. In fact, their vulnerable health often requires heavy medical treatment, which eventually demands the most effective (and therefore most expensive) technology. Thus, the welfare state, competing for tax allocation with other sectors, might not be able to sustain a growing number of elderly recipients, nor the cost of fulfilling their needs, if the current Beveridge social insurance system is not reformed.
Will this Covid-19 crisis drive substantial long-term changes in the welfare state, in the form of more state intervention and reforming measures, thanks to a new awareness of hospitals’ lack of resources? Or will the changes be temporary, acting as a precursor to drastic budget cuts under economic pressure? It remains to be seen.
On 5th May, at 10am, the Moroccan Health Minister announced one hundred new cases of Covid-19, bringing the number of confirmed infections to 5,153. The kingdom mourns its 180 deaths, while France grieves for its more than 131,000 confirmed cases and 25,000 confirmed deaths recorded on the same date. What follows is Ap. D Connaissances' exploration of Morocco's model for managing this epidemic.
Morocco declared a state of public health emergency and enforced a lockdown from 6pm on Friday 20th March (five days after France) in order to contain the spread of Covid-19, and the country seems to stand out from others affected by the coronavirus: Morocco has not visibly allowed itself to be overwhelmed by the health crisis. And it makes an impression – on the international stage, but also on its own citizens. The country, it seems, had time to prepare itself and to observe the response of affected neighbours like France or, before that, Italy.
We should remember that Morocco is not a member of the European Union; closing its borders and controlling flights to and from the hardest-hit places was possible, even easy (the country has fewer than ten international airports, and only one port through which passengers can enter). On this subject, the Moroccan news website Yabiladi (which draws its sources from the Moroccan Ministry of Foreign Affairs) reported that, as of 3rd May, there had been 387 deaths of Moroccans abroad from coronavirus, compared with 174 recorded deaths in Morocco itself. Adjustments, strategies and organisation: what are the factors that enabled this incredible resistance in the Kingdom of Morocco?
From an economic perspective, the country established an emergency fund of 10 billion Moroccan dirhams (around €1 billion/£830 million). The fund is financed by private businesses and individuals, but also by King Mohammed VI himself. It is thanks in particular to this fund that the Kingdom has been able to offer inexpensive masks to its citizens. Wearing a mask has been obligatory since 7th April, and nearly 7 million masks are produced in the country each day. Businesses have transformed their activities to meet national demand. Sold at very low prices, sometimes even less than 1 dirham (less than 10 Euro cents/9 pence), masks are readily available in local grocery shops, as well as in supermarkets and pharmacies.
Morocco was also one of the first countries to test the effects of the antimalarial drug chloroquine – the usage of which is heavily debated in France – on confirmed coronavirus cases. In April, Morocco authorised the use of chloroquine as a coronavirus treatment, combining it with an antibiotic. In France, Professor Raoult, a well-known infectious disease specialist, is one of the few to advocate for the effectiveness of treating Covid-19 patients with chloroquine and azithromycin as soon as the first symptoms of the virus appear. Can the difference in mortality rate on Moroccan territory compared with other affected countries (without forgetting Moroccan expatriates) perhaps be explained by the types of treatment being used? Is this a worthwhile initiative, a miracle product, or danger in liquid form? Note that while chloroquine seems to be effective in some cases, its most significant side effect can be bradycardia (a very slow heartbeat). One of our sources, working in intensive care in a hospital in France, explains that in Lille, for example, chloroquine is used in certain situations, but has not yet been subject to clinical trials. Its effectiveness has not been scientifically proven.
Military hangars in Morocco have been transformed into field hospitals. Military hospitals have themselves been requisitioned. The army has prepared for an eventual massive influx of patients. A major reorganisation can be observed within Moroccan hospitals, but not only this: the anaesthetist Dr Chafik El Kettani, speaking to the news outlet LCI, said that once the patients are clinically recovered, they have the choice between staying in hospital until they are biologically recovered, or to go to one of the ‘buffer’ facilities. These places, principally hotels, have been established to manage the movement of patients and ensure their well-being as much as possible. These spaces are disinfected several times a day, and patients are attended to by doctors who guarantee the monitoring of their recovery.
Beyond the purely technical and health-related provision that these structures offer (notably the decongestion of hospitals), they also represent a support zone for those who have been ill – a transition before their return home. From a social perspective, Morocco has taken the right course of action here too: from the beginning, a strict quarantine and curfew were imposed by the government. The images of (totally) empty streets in different Moroccan towns inspire a certain admiration for the cooperation of Moroccan citizens. An individual walking around without a valid certificate is not simply fined, but risks being immediately remanded in custody.
Education has not been neglected in the management of the crisis: distance-learning courses have been arranged. Inequalities between students have also been taken into account, with certain lessons even aired on a public television channel.
The countryside seems to have been spared by the epidemic, but the seemingly exemplary management of this health crisis has served to accentuate other inequalities in the most affected Moroccan towns – in particular, the struggles of the most vulnerable people, a real problem in lower-income neighbourhoods. Charities have thus been busy distributing food parcels to homes. The government, meanwhile, is paying a benefit of around 1000 dirhams (€100/£80) to undeclared workers.
According to the prince and political commentator Prince Moulay Hicham, the citizens of Morocco can be grateful for the management of ‘the health crisis, the lockdown measures, the quarantine, the funds and resources.’ Of course, the Prince is perhaps not entirely objective, because he is the first cousin of the King. Even so, it must be recognised that Morocco seems effectively to have known how to face the health, economic and social challenges linked to the coronavirus pandemic, and the figures support this.
While Morocco now fears a summer without tourists, this North African country has demonstrated its know-how, inspiring the admiration of great powers that prefer to dance in the streets, go kayaking, drink bleach* or wait for herd immunity.
Health Minister Khalid Ait Taleb has nevertheless warned Moroccans to remain vigilant, as he maintains that ‘the epidemic could double in intensity.’
Translated by Jenny Frost
*Drinking bleach is strongly discouraged by the editors of Ap. D Connaissances
In a patriarchal society in which inequalities between men and women persist, it is important to talk about the pioneers of feminism who fought for emancipation and for women’s rights. The talented Mexican painter Frida Kahlo is one of those who made the cause of women’s rights a personal affair. She established herself as a true icon of feminism, determined to defend “this silent and submissive mass”. Her intrepid character, her striking and revolutionary works and her nonconformism have made her the legend she is today. The beautiful Mexican woman with a ruined body and a broken heart, the “daughter of the Mexican revolution” as she liked to call herself, transformed the world from Mexico. Here is a look back at this strong and independent woman, and at her tragic life.
A dramatic existence: what doesn’t kill you makes you stronger…
Frida Kahlo claimed to have been born in 1910, a date symbolic of the beginning of the Mexican revolution, but in reality she was born in 1907 on the outskirts of Mexico City. She spent her childhood and most of her life in the “Casa Azul” (“blue house” in Spanish), which is now a museum dedicated to her and which houses many of Kahlo’s works and personal objects within its walls.
As a young child, Kahlo was forced to overcome serious challenges. At the age of only six, she fell victim to polio, an infectious disease that can cause paralysis. Before long, her right leg became deformed and stopped growing. These physical anomalies led her to be mocked by her classmates, who called her “crippled”. This period, although upsetting, helped her to mature, and she learned to turn this weakness into a strength.
From an early age, the Mexican decided she did not want to follow the same path as the other women in her country; that is to say, one dictated by misogyny. Kahlo dreamed of travelling, independence and freedom. She wished to study and to come to know love, pleasure and happiness. At the age of sixteen, she was already interested in politics, and had no plans to embark on the same journey as her father Wilhelm Kahlo, a great painter and photographer of German origin. Art was still, at this time, totally foreign to her.
An accident: a turning point in her life…
Kahlo was young, beautiful, and had a promising future ahead of her. She studied hard and was coming of age. But at eighteen came the blow that marked a turning point in her young life. She used to take the bus home from lessons; one day, the driver lost control of the vehicle and veered off the road, colliding with a tram. The result was catastrophic: many people died in the accident. Kahlo survived, but suffered serious injuries when she was pierced by part of the bus. She had a perforated abdomen, a fractured leg and a broken foot; her pelvis, ribs and spine were all broken. In short, her body was mutilated. She was forced to remain in hospital for several months in a plaster cast. Her reproductive organs were also affected; she would learn years later that she could not have children.
Only her mental strength would help her to overcome this trauma and these immense physical and moral challenges. She decided to fight against herself, against her own body. To do so, she took refuge in painting, depicting things on canvas as she saw them. Through her self-portraits, she focused on herself and made herself the true subject of her art. Painting helped her to get back on her feet and forget the suffering that consumed her.
1928 marked the beginning of her involvement in politics, when she joined the Mexican Communist Party. She wanted to change the order of things, overturn social codes and reduce inequalities; she wished to take part in the revolution to transform the world into one without classes, where oppressed groups would live in better conditions. The “daughter of the revolution” launched herself into a fight for the rights of women and against the machismo that was so common in Mexico, and everywhere else, at this time. The man was considered dominant, while the woman was reduced to the role of housewife. Kahlo could not stand these inequalities, and refused to submit to these degrading stereotypes.
Diego Rivera: a relationship marked by passion and by suffering.
Not long after this, she met Diego Rivera, a communist painter whose frescoes had brought him popularity (they can be seen today in San Francisco, for example). Kahlo was impressed by his work, and Rivera was captivated by this unique and talented woman. They shared the same love of art and of communism. It was love at first sight, and they decided to marry. Kahlo’s mother was unimpressed, calling the marriage “a union between a dove and an elephant”.
The pair were passionately in love, but before long, Rivera’s infidelity began to complicate their relationship. He had an affair with Kahlo’s sister, as well as with other women. These adulteries encouraged his wife to do the same: being bisexual, she took both male and female lovers, such as the Russian revolutionary Leon Trotsky and the Paris-based American performer Josephine Baker. Despite these unhealthy goings-on, the lovers could not bring themselves to leave each other; if anything, their passion for each other grew stronger. After this difficult period they divorced in 1939, before remarrying a year later as they couldn’t bear to be apart any longer.
Kahlo’s inability to have children as a result of her accident added to the series of psychological pains she had to face. Her repeated miscarriages took a terrible toll on her mental health. She was desperate, and felt alone and abandoned by everyone. This period would become one of the most painful of her life, and it would take her a long time to accept it and move forward.
The 1950s: the descent into hell.
From 1946 onwards her state of health, which had been stable, began to worsen progressively. Her pain became unbearable, and she had to undergo multiple operations on her spine. The outcome was grim: she was forced to spend no fewer than nine months in her hospital bed. No longer being in control of her own body put her in a difficult situation. However, she continued to paint, even attending an exhibition organised by her photographer friend Lola Alvarez Bravo from her hospital bed.
In 1953, she suffered another setback: her right leg had to be amputated because of gangrene. The operation eased her physical suffering, but her inner struggles felt more present than ever. The loss of her leg plunged her into a deep depression. She wrote in her journal that the amputation almost made her lose her mind. She felt a permanent desire to kill herself, and only the thought that her husband would miss her held her back. She said she had never suffered so much in her life.
From her wheelchair, she continued to fight for what she believed in. She never ceased to spread messages of equality and to depict taboo subjects (sex, desire, infertility) in her paintings. Weakened by pneumonia, Kahlo passed away in 1954, but the last words she wrote in her journal aroused suspicions among those around her: “I joyfully await the exit – and I hope never to return.” It was suspected that she had committed suicide, which would have allowed her to put an end to her pain.
She remains a woman brimming with life and a flamboyant model of strength and independence, who fought for her convictions until her last breath. Her final painting bears witness to this: she inscribed “viva la vida” among the vibrant colours that illuminate the work. These vivid colours may seem paradoxical when set against the struggles that filled her existence. Through this message of optimism, Kahlo shows that whatever difficulties someone experiences, life is still worth living.
Frida Kahlo: an icon of feminism
Throughout her life, Frida Kahlo embodied a true icon of feminism. She opposed the inflexibility of Mexican society, which resisted the emancipation of women. She constructed her myth alone, armed with her strong personality and uncompromising character.
Kahlo was an atheist in a deeply Catholic Mexico, and it was here that her originality and opposition to the norm first showed themselves. She openly called herself bisexual in a society attached to old-fashioned values, one that would have branded this “deviance”.
The political situation in Mexico had a great influence on her work. Revolution raged between 1910 and 1920, and the country emerged from this period weakened in every domain: economic, social and political. From 1920 to 1924 the situation began to stabilise, but inequalities persisted, and after this period distinctions based on gender returned in full force. Women, who had contributed during this hostile period and succeeded in breaking into the political sphere, were reduced to their former role as housewives. This meant a return to servitude, to domestic tasks, to an absence of rights. They had no access to political life or education. From a young age, Kahlo refused to submit to this, and took up the fight against gender stereotypes.
Her preferred weapon: painting. Through her art, she was able to raise awareness, provoke and express herself freely. Each of her paintings allowed her to address taboo subjects that nobody had dared to touch before. From sex to abortion, miscarriage and even depression, her paintings illustrate the experiences that make up the life of a woman. They allowed her to shine a light on the suffering of women and the trials they may face, which the men of the time had difficulty understanding. Kahlo placed no limits on herself; she even depicted genitalia. Many would criticise her work and view it as a mark of vulgarity and indecency. André Breton, the avant-garde surrealist, declared that her art was “a ribbon around a bomb”.
One thing that allows us to pick the Mexican artist out of a crowd is her monobrow and small moustache. She didn’t hide them; quite the opposite: she used them to make an impression in a world where women were subjugated and victims of social pressures. They were expected to conform to an imposed ideal, and for this reason any sign of masculinity was unwelcome. Kahlo, who embodied elegance and femininity, showed that her facial hair did not diminish her beauty. She wanted to liberate herself from the standard of the perfect woman, who had to fulfil so many criteria, and she unsettled people with her nonconformist values. In certain family photos, she is seen wearing an outfit otherwise reserved for men. Rivera, in one of his paintings, even depicted her with a cigarette in her mouth and a bottle of tequila in hand. A woman with a weakness for alcoholic drinks was more than frowned upon, but Kahlo liked a drink and did not hide it: “I drank to drown my sorrows, but the damned things learned how to swim.”
The cause of feminism and her fight in favour of minorities were not the only battles Kahlo undertook. “Mexicanidad” (“Mexicanness”), the embrace of her roots and of her identity as a Mexican woman, also featured on this already long list. After the long years of revolution and of economic, political and social chaos in Mexico, the country had to be rebuilt, and its residents given reason to be proud of their roots. This is why, in 1942, she became a member of the “Seminario de Cultura Mexicana”, an organisation created by the Minister of Cultural Affairs, which aimed to encourage the spread of Mexican culture by way of exhibitions and other cultural gatherings representing the country’s traditions.
Frida Kahlo, through her masterful and striking works, through her commitment to the rights of women and minorities, and through the courage she showed during a life of challenges, is today firmly rooted among the many women who have fought for equality. The painful trials she was forced to endure gave her the strength to face any difficulty. Her provocative paintings became a symbol of open-mindedness and freedom. Until her last breath, Kahlo, painting the most meaningful phases of her life, was conscious of being a spokesperson for all the women who were denied the right to express themselves.