ISN: Positions in Social Networks

Positions in a network matter for different reasons, such as well-being. In the following, several concepts will be introduced to gauge positions in a social network.

Structural balance

People prefer balanced relationship structures. According to Heider (1946), imbalances cause psychological distress. To restore balance, people create or drop ties. However, balance may not be equally important to everyone.

Structural holes

Occupying a structural hole means being positioned between two other actors whose only connection passes through oneself (Burt, 2009). In a structural hole, one is exposed to different views. However, network brokerage only offers a probability of advantage; it does not guarantee one.

Embeddedness of ties

A tie embedded in a triad with two additional strong ties is called a Simmelian tie. Simmelian ties are considered more powerful because they enforce solidarity and protect against malfeasance.
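
The definition above can be operationalised directly. A minimal sketch in plain Python (node names are invented for illustration), treating a Simmelian tie as a reciprocal tie whose endpoints are both reciprocally tied to at least one common third party:

```python
# Directed ties as (source, target) pairs: a fully reciprocated triad a-b-c
# plus a dangling reciprocal dyad d-e (all names illustrative).
edges = {("a", "b"), ("b", "a"), ("b", "c"), ("c", "b"),
         ("a", "c"), ("c", "a"), ("d", "e"), ("e", "d")}
nodes = {n for e in edges for n in e}

def reciprocal(u, v):
    return (u, v) in edges and (v, u) in edges

def simmelian_ties():
    out = set()
    for u, v in edges:
        if u < v and reciprocal(u, v):  # count each reciprocal tie once
            # embedded in a triad: some third party reciprocally tied to both
            if any(reciprocal(u, w) and reciprocal(v, w)
                   for w in nodes if w not in (u, v)):
                out.add((u, v))
    return out

print(sorted(simmelian_ties()))  # [('a', 'b'), ('a', 'c'), ('b', 'c')]
```

The dyad d-e is reciprocal but has no shared third party, so it is a strong tie without being Simmelian.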

Social capital

Social capital is, in general, a transferable form of capital that is inherent in the connections between people (Bourdieu, 1986; Coleman, 1988).

Safety and Effectance

Safety includes the fulfilment of emotional needs such as trust and reputation (e.g. embeddedness). Effectance, on the other hand, means learning new things and being autonomous (e.g. brokerage through structural holes).

The ties that torture

Being in a structural hole between two Simmelian ties means having to uphold two different sets of social constraints at once, which may even be contradictory (Krackhardt, 1999).

The strength of the weak tie

Weak ties connect one to networks with different information, which allows one to acquire new knowledge (Granovetter, 1973).

Embeddedness of economic actions

Economic actions are embedded in social relations. Thus, the constrained options of actors to engage in interaction are taken into account. The actions between economic actors depend on the type, strength, and embeddedness of their relationship.


Bourdieu, P. (1986). The Forms of Capital. In J. G. Richardson (Ed.), Handbook of Theory and Research for the Sociology of Education (pp. 241–58). Greenwood Press.
Burt, R. S. (2009). Structural holes: The social structure of competition. Harvard University Press.
Coleman, J. S. (1988). Social capital in the creation of human capital. American Journal of Sociology, 94, 95–120.
Granovetter, M. S. (1973). The strength of weak ties. American Journal of Sociology, 78(6), 1360–1380.
Heider, F. (1946). Attitudes and cognitive organization. The Journal of Psychology, 21(1), 107–112.
Krackhardt, D. (1999). The ties that torture: Simmelian tie analysis in organizations. Research in the Sociology of Organizations, 16(1), 183–210.

PE: Bureaucracy theory

We will be looking at the political process in an exogenous political environment. Policies are demanded by citizens/voters and interest groups, whereas they are supplied by delegates/representatives/politicians and public administration. Public administration is claimed to be motivated either by rent-seeking or by community engagement.

A central question becomes how to measure the quality of government and public services. Four typical approaches are surveys, input-output comparison, the efficiency frontier (how efficiently they perform compared to an optimum), and efficiency and effectiveness measures.

Economic Theory of Bureaucracy

A homo oeconomicus optimises his/her own utility. Whereas an entrepreneur simply optimises profit, a bureaucrat is forced to pursue self-interest within institutional constraints that often exclude economic profit. Max Weber (2002) argues that a natural objective of a bureaucrat is power. Russell (2004) offers three subdivisions: direct physical power, rewards and punishment, and influence on opinion. Only under uncertainty does the potential arise to exert the last type of power, whereas information creates the opportunity to actually do so.

Power allows for personal advantages, and bureaucrats accrue non-monetary benefits: job security, jobs for relatives, additional perks, reduced work effort, reputation, and more. To justify or hide these advantages, the nonpecuniary goals of a bureaucrat become the size of the bureaucracy, slack within the bureaucracy, and risk aversion.

To analyse bureaucracy, usually a two-actor environment is observed: a sponsor who delegates a task and a bureau that executes it and delivers results. Issues arise due to conflicting interests (providing a public good versus personal advantages) and information asymmetry (the true cost is only known to the bureau). Subsequently, measurement issues create a monitoring problem. Usually, only the activity of a bureau can be observed, but not its output (e.g. national defence, education). As the sponsor is a monopsonistic buyer (on behalf of society) and the bureau is a monopolistic supplier (to circumvent wasteful duplication), efficiency is not required and information cannot be sourced from an alternative.

Observations in reality have shown that bureaucrats’ wages are unrelated to efficiency and that any form of performance-dependent pay would be hard to measure.

Budget-maximising bureaucrat

Proposed by Niskanen (1971), it constitutes a simple model that assumes:

  1. Personal interests of bureaucrats are followed by maximising the budget
  2. The bureau has a monopoly position
  3. The cost function is not known by the sponsor
  4. The bureau can make all-or-nothing budget proposals

It follows that the budget [latex]B[/latex] depends on the perceived output level [latex]Q[/latex] ([latex]B=B(Q);\ B'>0;\ B''\leq 0[/latex]) and that the costs [latex]C[/latex] depend on the output level [latex]Q[/latex] ([latex]C=C(Q);\ C'>0;\ C''\geq 0[/latex]).

A bureaucrat then has the Lagrangian objective function [latex]O_B=B(Q)+\lambda(B(Q)-C(Q))[/latex], where the Lagrangian multiplier [latex]\lambda[/latex] represents the marginal utility of an expansion of the budget constraint to the bureau and hence is positive. Differentiating with respect to [latex]Q[/latex] and setting the result to zero gives [latex]B'(Q)+\lambda(B'(Q)-C'(Q))=0[/latex], i.e. [latex]B'(Q)=\frac{\lambda}{1+\lambda}C'(Q)[/latex], together with the binding constraint [latex]B(Q)=C(Q)[/latex]. The optimal outcome for society would instead be [latex]B'(Q)=C'(Q)[/latex].

The bureaucrat can know the sponsor’s social surplus and compute and request a budget that reduces this surplus to zero, unless [latex]B'\leq 0[/latex]. The additional condition explains a bureau’s infringement into other domains to justify increasing the budget.
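
A small numerical sketch of the model, using assumed functional forms (a linear-quadratic benefit and constant marginal cost; these are a standard illustration, not from the lecture):

```python
# Assumed forms (illustrative): perceived benefit B(Q) = a*Q - b*Q**2/2
# (B' > 0 for relevant Q, B'' < 0) and cost C(Q) = c*Q (constant marginal cost).
a, b, c = 10.0, 1.0, 2.0

def B(q): return a * q - b * q**2 / 2  # sponsor's perceived benefit
def C(q): return c * q                 # bureau's true cost

q_social = (a - c) / b      # society's optimum: B'(Q) = C'(Q), i.e. a - b*Q = c
q_bureau = 2 * (a - c) / b  # bureau's choice: B(Q) = C(Q), surplus exhausted

print(q_social, q_bureau)                    # 8.0 16.0
print(round(B(q_bureau) - C(q_bureau), 10))  # 0.0
```

With these forms the bureau’s output is exactly twice the social optimum, and the sponsor’s entire surplus is absorbed into the budget.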

Alternative institutional assumptions

Any of the four assumptions can be relaxed. Relaxing assumption 4, a sponsor could require more than one budget proposal with different levels of activity, thereby weakening the agenda-setting role of bureaucrats. The bureau must announce the price [latex]P[/latex] at which it will supply a level [latex]Q[/latex] that will subsequently be set by the sponsor. A bureau now chooses the [latex]P[/latex] that maximises [latex]B[/latex]. The demand elasticity [latex]\eta=\frac{P}{Q}\frac{dQ}{dP}[/latex] can be used by the bureaucrat to choose the largest budget. Under the assumption of a linear demand schedule, constant marginal costs, and known sponsor demand, a bureau will ask for a price [latex]P[/latex] such that [latex]\eta=1[/latex], as long as this price is higher than its marginal costs.
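
The unit-elastic pricing rule can be sketched numerically under the same assumptions (linear demand, constant marginal cost; all numbers are illustrative):

```python
# Assumed linear demand P(Q) = alpha - beta*Q and constant marginal cost mc.
alpha, beta, mc = 10.0, 1.0, 3.0

q_unit = alpha / (2 * beta)     # |eta| = 1 at the midpoint of linear demand
p_unit = alpha - beta * q_unit  # the corresponding price, alpha/2
p_chosen = max(p_unit, mc)      # never price below marginal cost

budget = p_chosen * (alpha - p_chosen) / beta  # B = P * Q(P)
print(p_chosen, budget)  # 5.0 25.0
```

The budget [latex]B=PQ[/latex] is a revenue, and revenue on a linear demand schedule peaks exactly where the elasticity equals one.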

Other assumptions can be challenged as well. Monitoring can be introduced (relaxing assumption 3) where investigative bodies sift through the expenses of public bodies and eventually unveil oversized budgets. Risk-aversion has been ignored so far. The sponsor could conceal its demand or a market for a service (i.e. competing bureaus) could be introduced.

Alternative behavioural assumptions

On the one hand, a slack-maximising bureaucrat wants to maximise x-inefficiency to increase his/her gain relative to the service provided. The minimally accepted service is provided at the indifference optimum below the social optimum such that a larger share of the budget can be acquired. On the other hand, a risk-averse bureaucrat would choose actions with the lowest potential penalties. Therefore, avoidance of action often occurs.

Other behavioural considerations that influence a bureaucrat’s behaviour include the crowding-out of intrinsic motivation by monitoring, social norms, availability bias (working only on salient risks), and lacking feedback mechanisms.

Counter arguments

Promotions in bureaucracies are highly competitive and require a good “track record” of the respective bureaucrat, so there is a market for bureaucrats. The discretionary power is actually lower than in the private sector, and therefore inefficiency may not be properly judged. Lastly, (democratic) governments are under (re-election) pressure and therefore are keen on monitoring the efficiency of a bureaucracy.

Power of the agenda setter

The agenda setter can prepare a choice where the non-acceptance of an oversized budget would result in the under-provision of a service. The ability to set the agenda therefore results in power.

Control of the public sector

To reduce excesses in the public sector, usually some form of competition is introduced (e.g. between administrative units, by private services), budget constraints are tightened (e.g. designated taxes (earmarking), a limited tax base, and auditing), political restrictions are imposed (e.g. direct votes, separation of administration and politics), and rewards and punishments are applied.


Niskanen, W. A. (1971). Bureaucracy and representative government. Transaction Publishers.
Russell, B. (2004). Power: A new social analysis. Psychology Press.
Weber, M. (2002). Wirtschaft und Gesellschaft: Grundriss der verstehenden Soziologie. Mohr Siebeck.

IAP: Netheads and Bellheads

In the 1990s, the great debate on how the Internet should be developed was coined Netheads versus Bellheads. Netheads originated from the people that developed network technology, whereas Bellheads originated from the Bell Laboratories – a research institution of telecommunication companies. At the core was a technical discussion: whether packet switching or circuit switching is more useful, how big the metadata overhead should be, and how long the setup of a connection should take. However, technological development over the next decade rendered the debate irrelevant. In retrospect, it also seems to have been easier to expand the network with IP (just assign an address) than with ATM (establish circuits). Also, the technical specifications of ATM became so complicated that they were too cumbersome.

The technical debates were just a superficial expression of an underlying discussion. The Netheads viewed the Internet from a data perspective, whereas the Bellheads viewed it from a (continuous) signal perspective. Another issue was that Netheads came from a young industry with little to no corporate backing (market entrants), whereas Bellheads had a century of corporate history behind their backs (incumbents). The Netheads nearly waged a crusade. The divide can be viewed through a US political lens as well, where Netheads lean liberal and Bellheads lean conservative.

CSD: Space Syntax Theory

Space syntax is a social theory on the use of space. It encompasses a set of theories and techniques that examine the relationships between people (e.g. individual/user/society) and the environment (indoor/outdoor).

Recommended basic readings are Lynch’s “The Image of the City” (Lynch, 1960) as well as “Space is the Machine” (Hillier, 2007). An advanced reading is “The Social Logic of Space” (Hillier & Hanson, 1989), which also introduced Space Syntax.

Spatial Configuration

Spatial configuration defines how the relation between spaces A and B is modified by their relation to space C (Hillier, 2007).

Representation of Space

Isovists, also called viewsheds in geography, are the volume/area of the 360° field of view from a particular point of view. Lines of sight are used to construct the isovists. Partial isovists are also constructed to mimic the human field of view. Psychologists have suggested (but not yet quite proven) that the shapes of isovist polygons influence the behaviour of humans. Each point of view generates its own isovist. Visibility Graph Analysis converts the set of isovists into a measure of visibility.

When people move, they like to move in straight lines (confirmed in Spatial Cognition). Axial lines provide potential long straight lines along which one could walk. A typical analysis chooses the minimal set of longest axial lines that allows one to see the complete space.

The major assumptions of space syntax are that people move along lines (axial lines), see changes in visual fields (VGA), and interact in convex space (which is not covered here).

Measuring centrality and graphs

To convert a road network into a graph, the roads are taken as nodes and connections between roads as edges. Curves are replaced by a set of lines that mimic the curvature. Segment angular analysis splits roads into segments (according to connections to other roads); additionally, the connections are weighted by changes of direction. Essentially, degree centrality is measured. Other measures of network centrality are used as well. Closeness centrality is called Integration in Space Syntax; betweenness centrality is called Choice. Other centrality measures are currently not applied in Space Syntax.
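
The mapping of Space Syntax terms onto standard centrality measures can be illustrated on a toy axial-line graph (street names are invented; `networkx` is assumed to be available):

```python
import networkx as nx

# Toy axial-line graph: nodes are street lines, edges are intersections.
g = nx.Graph([("High St", "Mill Ln"), ("High St", "Church Rd"),
              ("Mill Ln", "Church Rd"), ("Church Rd", "Station Way")])

integration = nx.closeness_centrality(g)  # "Integration" in Space Syntax
choice = nx.betweenness_centrality(g)     # "Choice" in Space Syntax

# Church Rd is the only route to Station Way, so it scores highest on both.
best = max(choice, key=choice.get)
print(best, integration["Church Rd"])  # Church Rd 1.0
```

In a real segment angular analysis the edges would additionally be weighted by the change of direction between segments; the unweighted version above only shows the correspondence of the measures.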


Hillier, B. (2007). Space is the machine: a configurational theory of architecture. Space Syntax.
Hillier, B., & Hanson, J. (1989). The social logic of space. Cambridge University Press.
Lynch, K. (1960). The image of the city. MIT press.

SMADSC: Social Networks

Social networks often give structure to relations. They can be considered as abstract, mathematically tractable, and computationally instantiable systems. Social networks have become a field of their own. It is highly interdisciplinary, touching mathematics (graph theory), computer science (algorithms), sociology (population group trends), psychology (individual and social behaviour), and complex network theory.

Social networks emerged from interpersonal contact. They can be understood as a descriptor for social trends (Cioffi-Revilla, 2013). The basic elements are Nodes (units of observation), Edges (relationships), and Aggregations (Dyads, Triads, Cliques, Clusters, etc.). More advanced elements are Descriptive Properties (e.g. centrality measures).

A network can also be seen as an abstract topology and “social glue”. Agents can move around the network by jumping from node to node, either along connecting edges or freely. Alternatively, nodes can be mapped onto agents, with agents moving around a raster or along the edges.
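
The first option, agents jumping from node to node along connecting edges, can be sketched as a random walk on a plain adjacency list (all names illustrative):

```python
import random

# Toy network as an adjacency list.
adjacency = {
    "a": ["b", "c"],
    "b": ["a", "c"],
    "c": ["a", "b", "d"],
    "d": ["c"],
}

def random_walk(start, steps, rng):
    """Move an agent along connecting edges, recording the visited nodes."""
    node, path = start, [start]
    for _ in range(steps):
        node = rng.choice(adjacency[node])  # jump to a random neighbour
        path.append(node)
    return path

path = random_walk("a", 5, random.Random(42))
# Every consecutive pair in the path is a connecting edge:
assert all(b in adjacency[a] for a, b in zip(path, path[1:]))
print(path)
```

The free-jumping variant would simply draw the next node from all nodes instead of from `adjacency[node]`.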

A network trades off regularity and complexity, relative size and relative complexity as well as network complexity and network connectivity.

Social Network Analysis

Social Network Analysis (SNA) is based on a machine-readable representation of a social network, i.e. an adjacency matrix. While there is no “best measure” to describe a node or edge, there are several useful descriptive properties.

Bridging and spanning nodes can be identified. Also, cliques and clusters can be identified, which gives a relative density of the network. Lastly, measures of relative connectedness and centrality are often used.
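
Starting from an adjacency matrix, density and clique detection can be sketched with `networkx` (toy data):

```python
import networkx as nx

# Adjacency matrix of a 4-node undirected toy network.
adjacency = [[0, 1, 1, 0],
             [1, 0, 1, 0],
             [1, 1, 0, 1],
             [0, 0, 1, 0]]
g = nx.Graph((i, j) for i in range(4) for j in range(i + 1, 4)
             if adjacency[i][j])

density = nx.density(g)             # edges present / edges possible
cliques = list(nx.find_cliques(g))  # maximal fully connected groups

print(round(density, 3))             # 0.667 (4 of 6 possible edges)
print(sorted(map(sorted, cliques)))  # [[0, 1, 2], [2, 3]]
```

Node 2 bridges the triad {0, 1, 2} and node 3, which is exactly the kind of spanning position the descriptive properties are meant to reveal.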

Social Psychology

Instead of observing the network as a whole, it can be analysed from the node perspective. Nodes can be grouped into “self” (ego) or “other” (alter). The “self”’s purpose is “self-motivated” action relative to its role and its subjective network knowledge. If nodes are “other”, then their function is that of an arbiter or reactive agent. In this view, edges represent social connectivity in the network. They represent evidence of physical, informational, or some other material or non-material transfer or contact between nodes. Typically, the edges suggest some social binding between individuals and/or groups of nodes. Finally, an edge often connotes implicit temporal properties. Dyads are any two connected nodes in the network, triads are any three connected nodes, and cliques are larger fully connected groups. Simmelian ties are strong, bidirectional social bindings.

ISN: Network visualisation

Today’s topic is visualising networks and centrality measures. We visualise a network to better understand the underlying data. A visualisation should be driven by the question that we would like to answer. Nonetheless, visualisations are by their nature exploratory. Also, visualisations do not provide evidence for hypotheses.

Visualisation usually tries to convey information through the layout. Density tries to convey cohesion; distance tries to convey graph-theoretic distance; tie length tries to convey attached values; geometric symmetries try to convey structural symmetries.

General rules of graph visualisation are that edge crossings, overlaps, asymmetries, and meaningless edge lengths or node sizes should be avoided.

Visualisation in R

We will use either the “igraph” or “sna” library to visualise the data.

PE: Public Good Game

Public Good Game

Each subject secretly chooses how much of their initial endowment to put into a public pot. The joint value in this pot is multiplied by a factor ([latex] 1 < factor < N [/latex]) and evenly paid out across all [latex]N[/latex] subjects. All unspent endowments are kept by the respective subjects. In one-shot games a non-cooperative strategy is usually applied. In infinitely repeated games, subjects eventually cooperate.
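
One round of the game can be sketched numerically (the endowment, multiplier, and group size below are assumed for illustration):

```python
N, endowment, factor = 4, 10.0, 1.6  # assumed numbers, 1 < 1.6 < N

def payoffs(contributions):
    share = factor * sum(contributions) / N  # pot is multiplied, split evenly
    return [endowment - c + share for c in contributions]

everyone = payoffs([10, 10, 10, 10])   # full cooperation
free_rider = payoffs([0, 10, 10, 10])  # one defector

print(everyone)    # [16.0, 16.0, 16.0, 16.0]
print(free_rider)  # [22.0, 12.0, 12.0, 12.0]
```

Each contributed unit returns only factor/N = 0.4 to the contributor, so defection dominates in the one-shot game even though full cooperation pays everyone more.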

However, experimental results are not in line with the predictions of rational choice theory. Even in one-shot, two-person prisoner’s dilemma games, half the participants cooperate. Voluntary contributions to public goods fall if the game is repeated, and therefore participants have been called “adaptive egoists” (Mueller, 2003) or “conditionally cooperative” (Gächter, 2006).

People seem to contribute due to a “warm glow” preference (utility from contributing) (Palfrey & Prisbrey, 1997), altruistic preferences (wanting to increase others’ utility), or due to error and learning (testing the best approach).

Gächter found that voluntary cooperation is fragile, that there exist social interaction effects in voluntary cooperation, that group composition matters (like-mindedness), and that the management of beliefs matters. Also, path dependency has been observed, where the first round is the most important.

In another field experiment, students were informed how many others had contributed to social funds (64% versus 46%), and they were influenced by the numbers (Frey & Meier, 2004).

The cooperative environment (Ostrom, 1998)

Models of complete rationality do not work well in non-competitive situations. Verbal communication can improve trust and allow groups to reciprocate. There is also some capacity to solve second-order social dilemmas that change the structure of the first-order dilemma (e.g. instituting punishment). Ostrom proposed a model of bounded rationality where individuals use heuristics, norms, and rules to improve the outcomes of non-competitive, not frequently repeated situations.

According to Ostrom, reciprocity, reputation, and trust can overcome strong temptations of short-run self-interest. Consequently, a self-reinforcing process can be created that increases the level of cooperation and leads to higher net benefits for all, but it is very fragile.

The system is stable if it has a small size, symmetry of assets and resources, a long time horizon, and a low-cost production function. As the group size increases, the marginal gains from contributing fall and it is more difficult to identify and punish defectors. As the stakes increase, cooperation is reduced. Other issues that may arise are: monitoring becomes more costly, economies of scale may not apply, and marginalisation behaviour within the group may be questioned (is the punishment fair?).

Current Research (Lanz, Wurlod, Panzone, & Swanson, 2017)

A field experiment in supermarkets in the greater London area compared the quantitative impact of three measures to reduce the footprint of consumption (welfare analysis). The experiment tested conditions on four types of goods: soda, milk, spreads, and meat. Three conditions were used:

  1. Information label
  2. Pigouvian tax based on relatively higher footprint of “dirty-type” product alternative
  3. Neutrally framed price change.

The main findings include that the effectiveness of all policy interventions is higher if substitutability is higher, and that motivation crowding-out due to taxation is relevant only for low-effort products.

Critiques were raised since it was a “one-shot” game testing “warm glow” and it required a state solution.

Cognitive biases

Cognitive biases are ways of thinking that can lead to systematic deviations from the benchmark of rationality. The concept originates from psychology and behavioural economics. Classic biases include the availability heuristic (remembered events seem more likely), confirmation bias (consuming new information as complementing preconceptions), the endowment effect (ownership changes value perception), and the framing effect (the presentation affects the conclusion).

Preference Aggregation

The problem with the state without agency and the state as a nexus of cooperation (Acemoglu, 2009) is that the government is treated as a black box that takes individual preferences as input and provides a collective choice as output. Arrow’s Impossibility Theorem (Arrow, 1963) shows that “the only voting method that is not flawed is a dictatorship.” Arrow stipulates that, even under reasonable requirements, there cannot exist a ranked voting system (i.e. a social welfare function or preference aggregation rule) that transforms the set of preferences into a single global societal order for two or more participants with three or more options. Arrow specifies the reasonable requirements as:

  1. Non-dictatorship
  2. Universality (unique and complete ranking)
  3. Independence of irrelevant alternatives
  4. Pareto efficient (unanimity)

Pairwise voting over alternatives does not lead to a complete ordering. Only through violation of the reasonable requirements (dictatorship, Kaldor-Hicks efficiency, non-universality) can preference aggregation be performed. Cox and McCubbins (2000) showed that voting rules shape policy outcomes even if voter preferences are fixed.
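
The incompleteness of pairwise voting can be demonstrated with the classic Condorcet cycle, here as a minimal sketch with three voters and three options:

```python
from itertools import combinations

ballots = [("A", "B", "C"),   # voter 1: A > B > C
           ("B", "C", "A"),   # voter 2: B > C > A
           ("C", "A", "B")]   # voter 3: C > A > B

def majority_prefers(x, y):
    """True if a strict majority ranks x above y."""
    votes = sum(1 for b in ballots if b.index(x) < b.index(y))
    return votes > len(ballots) / 2

for x, y in combinations("ABC", 2):
    winner = x if majority_prefers(x, y) else y
    print(f"{x} vs {y}: majority prefers {winner}")
# A beats B, B beats C, yet C beats A -> a cycle, so no complete ordering.
```

Every individual ballot is a complete, transitive ranking, yet the pairwise majority relation is cyclic, which is the intuition behind Arrow’s result.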

Nudges or “Libertarian paternalism”

A nudge can be understood as a change in the choice context (one that would be irrelevant to the homo oeconomicus) to intentionally steer real-world agents’ behaviour in the direction of the homo oeconomicus benchmark (see the organ donor debate). Critiques of such “behavioural welfare economics” are again that no global social preference exists and that, without appropriate information, manipulation could occur. It also implies a benevolent paternalist state, which is debatable.

Endogenous versus exogenous institutions

The difference lies in whether one assesses how institutions arose (endogenous) or how institutions impact policy (exogenous). This course focuses on the latter.


Acemoglu, D. (2009). Political economy lecture notes. Retrieved February 27, 2017.
Arrow, K. J. (1963). Social Choice and Individual Values. New York: Wiley.
Cox, G. W., & McCubbins, M. D. (2000). Political structure and economic policy: The institutional determinants of policy outcomes. In Presidents, Parliaments and Policy (pp. 21–96). Cambridge University Press.
Frey, B. S., & Meier, S. (2004). Social comparisons and pro-social behavior: Testing “conditional cooperation” in a field experiment. The American Economic Review, 94(5), 1717–1722.
Gächter, S. (2006). Conditional cooperation: Behavioral regularities from the lab and the field and their policy implications. CeDEX Discussion Paper, 3.
Lanz, B., Wurlod, J.-D., Panzone, L., & Swanson, T. (2017). The behavioural effect of Pigovian regulation: Evidence from a field experiment. IRENE Working Paper.
Mueller, D. C. (2003). Public Choice. Springer.
Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action: Presidential address, American Political Science Association. American Political Science Review, 1, 1–22.
Palfrey, T. R., & Prisbrey, J. E. (1997). Anomalous behavior in public goods experiments: How much and why? The American Economic Review, 829–846.

ASC: Introduction

Argumentation and Science Communication will discuss how scientific arguments are made and how they are eventually communicated.

The first week’s readings are listed in the references (Bradley & Steele, 2015; Lempert, Nakicenovic, Sarewitz, & Schlesinger, 2004). A particular focus will be on Mueller (2010), for which the following questions should be answered:

  1. What is a computer model?
  2. What are the basic reasons for limits/errors/uncertainties of (climate) models or model predictions? Which ones, do you think, are most important, and for which situations?
  3. What kind of results are generated by models?
  4. What are reasons for and against the claim that computer models inform us about the world?

Scientific knowledge makes sense in its context, but communicating it across fields, or indeed beyond science, makes it necessary to pick an appropriate language. There is another issue: science is asked to be informative rather than prescriptive; however, it is usually presented and perceived in a prescriptive manner.


Argumentation is needed in policy analysis because decisions are made under deep uncertainty. Argumentation analysis is a philosophical method to address the uncertainty.

Predict-then-act (Lempert, Nakicenovic, Sarewitz, & Schlesinger, 2004)

A rational decision is made for policy alternatives that are ranked on the basis of their expected utility, contingent on the probabilities of alternative future states of the world. Under rational choice theory, the outcome with the highest utility would be picked. Rational choice would therefore require no democratic deliberation, if science were done properly. However, each step of the scientific endeavour requires deliberation. Science itself is agenda-setting and therefore cannot be performed unquestioned. The rational choice assumptions are not fulfilled.

[Deep uncertainty exists when] decision-makers do not know or cannot agree on: (i) the system models, (ii) the prior probability distributions for inputs to the system model(s) and their inter-dependencies, and/or (iii) the value system(s) used to rank alternatives. (Lempert et al., 2004)

Such a reductive approach is an idealisation that abstracts away many aspects. On the upside, it is a smart formal approach that can represent diverse measures within one holistic measure (expected utility). On the downside, it may not matter for a decision (different statistical values of life for different countries) or it may not apply (requirements may not be fulfilled due to a lack of knowledge beyond information about outcomes).

Summa summarum, rational choice should be seen as a special case, rather than a general paradigm to conceive a policy decision problem. Argumentation is necessary to delineate the problem and frame the options, characterise uncertainties of outcomes, characterise value uncertainties, evaluate uncertainties, and deliberate a fair decision from plural perspectives (Hansson & Hirsch Hadorn, 2016).


Philosophical methods address two questions in a systematic way:

  1. What do you mean? Answer requires an analysis of respective concepts.
  2. How do you know? Answer requires an analysis of respective arguments.

Philosophy does not generate new knowledge, but refers to what is known. It makes relevant distinctions and describes the consequences for normative claims about knowledge (epistemology) and action (ethics). It points at the limits of what we can believe and can therefore be considered a method for critical thinking.

Critical Thinking

The goal is to argue, communicate and act responsibly. It requires the ability to accept criticism as well as a willingness to reach an understanding. Based on expertise in a scientific field or regarding real-world problems, critical thinking tries to apply cognitive skills to answer the two basic questions of philosophy mentioned above.

Short Guide to Analysing Texts (Brun & Hirsch Hadorn, 2014)

A guide that structures analysing texts in 5 steps:

  1. Rules for Working and Principles of Understanding
    • Refer explicitly to the analysed text.
    • Write down results and your main reasoning.
    • Develop text analyses and results which are understandable to others.
    • Give reasons for your text analyses/results.
    • Read texts several times to test and revise understanding of both the parts and the whole.
    • Start with the assumption that the author makes true statements and gives sound arguments and go on to test this assumption (principle of charity).
  2. Preparing the Text Analysis: How to Proceed
    • Gather basic information about the text: Authors (ghostwriting?), publishing (scientific or popular?), topic, context of writing and publishing, target readership, type/function of text, and impact
  3. Reading: How to Work on the Text
  4. Structuring: How to Analyse the Structure of the Text
    • Divide the text into smaller passages (e.g. paragraphs) and number them consecutively
    • For every passage of text, consider the following questions:
      1. Content: What is this passage about?
      2. Function: Which function does this passage serve?
      3. Summary: How would you title the passage (content- and function-wise)?
  5. Summarising: How to Capture the Essential
    • Represent concisely essential statements, central arguments and the basic structure of the text.
    • Summaries have to be comprehensible without acquaintance with the original text.
    • Adapt the representation to the aim of your text analysis and to the question it should answer (a short text is not always the optimal solution).


Bradley, R., & Steele, K. (2015). Making climate decisions. Philosophy Compass, 10(11), 799–810.
Brun, G., & Hirsch Hadorn, G. (2014). Textanalyse in den Wissenschaften: Inhalte und Argumente analysieren und verstehen. vdf Hochschulverlag AG.
Hansson, S. O., & Hirsch Hadorn, G. (2016). The Argumentative Turn in Policy Analysis. Springer International Publishing.
Lempert, R., Nakicenovic, N., Sarewitz, D., & Schlesinger, M. (2004). Characterizing climate-change uncertainties for decision-makers. An editorial essay. Climatic Change, 65(1), 1–9.
Mueller, P. (2010). Constructing climate knowledge with computer models. Wiley Interdisciplinary Reviews: Climate Change, 1(4), 565–580.

Urban Design II: Los Angeles

Today’s topic will be the Urban Design of Los Angeles. The main tools will be top-down infrastructure (Ecology/landscape), fragmented sub-urban (suburbia) and places for experimentation (micro/temporary programs).

Los Angeles is a car city. It is the antagonist to New York, the incarnation of the battle between the East and West Coast. Hollywood is located in Los Angeles and Hollywood produces the modern understanding of America.

Los Angeles is also a (horizontal) grid city. The city grew out of property speculation with Asian migrants building the railways in hope of a better future. Architects came to Los Angeles to build a small bungalow or two. Los Angeles is multiple cities in one place (not geographically, but imaginary).

Los Angeles is also a Postmodern City. The modern idea ended with the Second World War. Postmodernity interweaves the past with the present and expectations of the future, in contrast to the Modern concepts of Communism, scientific endeavour, and Socialism.

Ed Soja describes Los Angeles as highly fragmented, inducing a feeling of being lost and dislocated, epitomised in the Bonaventure Hotel. Conventional understandings of a city are questioned by Los Angeles. Conventional standards of planning do not apply. Individuals, pressure groups, and planning authorities vie over how to develop the city. Non-planning has become a characteristic of the city. Frank Gehry is a product of Los Angeles. Angelenos spend one month per year in their cars.

Los Angeles is also marked by Skid Row (homeless encampments in tents) and riots, both forming the character of the city. The Mexican past of the city mixes with the American identity and incoming Asian cultures (Korean, Chinese and Japanese).

The City of Los Angeles has a density of 3,000 inhabitants per square kilometre with 4 million inhabitants, whereas Los Angeles County drops to 900 inhabitants per square kilometre at 10 million inhabitants. Greater Los Angeles incorporates neighbouring counties, and the number of inhabitants rises to 18 million.
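The quoted figures can be sanity-checked with simple arithmetic: dividing population by density gives the implied land area at each scale. A minimal sketch (the `implied_area_km2` helper is hypothetical; the population and density values are the ones quoted above, and the resulting areas are computed, not official figures):

```python
def implied_area_km2(population: int, density_per_km2: float) -> float:
    """Land area implied by a population and a density in inhabitants/km^2."""
    return population / density_per_km2

# City of Los Angeles: 4 million people at 3,000 inhabitants/km^2
city_area = implied_area_km2(4_000_000, 3_000)

# Los Angeles County: 10 million people at 900 inhabitants/km^2
county_area = implied_area_km2(10_000_000, 900)

print(f"implied city area:   {city_area:,.0f} km^2")    # ~1,333 km^2
print(f"implied county area: {county_area:,.0f} km^2")  # ~11,111 km^2
```

The roughly tenfold jump in implied area for only a 2.5x jump in population illustrates how sharply density falls off outside the city proper.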

Top-down Infrastructure

Los Angeles laid down the grid and invited people to come. Real estate taxes drove the engine of Los Angeles. 160 administrative subdivisions are contained within the five counties that Greater Los Angeles covers.

Los Angeles was unexplored and presented itself as a desirable opportunity. Route 66 was the road from Chicago to Los Angeles that people travelled in pursuit of happiness. Los Angeles was sold as an antidote to the urban city by promoting the suburban. Trams connected the suburbs to the centre, but were soon complemented and then overtaken by highways. Within ten years in the 1920s, Los Angeles grew with the incoming migrants from empty fields into a full-blown city. As a side note, the same is happening in most major cities in developing countries, from China and India to Nigeria. Los Angeles also planned quarters that enforced segregation.

Los Angeles had no notable economy before 1920, but soon produced 25% of US oil and expanded its importance further via the film industry.

The hostile environment (swamp, desert, mountains) made occupying the area difficult, so vast and strategic infrastructure is necessary for the city to work. Water is imported through massive aqueducts from the Colorado. Electricity is imported over the longest high-voltage electricity line, from the Pacific Northwest. The hot summers make people use air conditioning on an incredible scale. The interstate highway connection to New York binds the city to the rest of the US.

The floodwater infrastructure cuts through the city and carries valuable water to the ocean. The former Los Angeles River has been converted from a meandering stream into a straight channel. Most of the time it is an empty eight-metre gap in the city, but during strong downpours it may completely fill with floodwater. Graffiti artists reclaim the floodwater system by filling it with art; people reclaim it with sports and recreation. (Reclaiming) infrastructure will become a core task of architecture.

Fragmented Sub-urban

Suburbia was an attempt to decentralise cities in the face of the nuclear threat of the Cold War. Los Angeles has no strong centralised core, but spreads equally in all directions. Fragmentation ensues as each suburban area creates its own nucleus.

The Pacific Electric Railway connected the disparate suburbs and enabled movement, followed by the highway. Industry gathered along the river and fragmented the city even further: industrial suburbs.

Hollywood is a suburb. The suburb was projected onto the rest of the US with the help of the film industry. The suburb was also a dream in which every hard-working American could have a house.

The suburban and exurban have become urban, as the density of suburbia and communication technologies remove any distance. Modern cities are hybrid, both urban and non-urban.

Places for experimentation

New architecture was tested in Los Angeles. Many architects went to Los Angeles to build bungalows and tried new approaches to building. Los Angeles is the playground of architecture. The Lovell Beach House was built in the 1920s and looks like contemporary 2010s architecture.

Lovell Beach House

Architecture of single buildings is the first step towards urban design. Houses are the atomic elements of urbanism. Frank Lloyd Wright becomes the greatest American architect and inspires many architects in Los Angeles, giving the city its urban characteristic. Architects pick up elements of the past, mix them with their own interpretation and create buildings for the future in the present. Los Angeles was the core of Postmodern architecture in the US. Experimental architecture, put forward by Gehry with works such as the Disney Concert Hall, becomes a core concept for creating new architecture.

Disney Concert Hall