PE: Public Good Game

Public Good Game

Each subject secretly chooses how much of their initial endowment to put into a public pot. The joint value in the pot is multiplied by a factor (1 < factor < N) and paid out evenly across all N subjects. Any unspent endowment is kept by the respective subject. In one-shot games a non-cooperative strategy is usually predicted. In infinitely repeated games, subjects eventually cooperate.
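The payoff structure above can be sketched in a few lines. This is a minimal illustration with assumed parameters (endowment 20, factor 1.6, N = 4), not the design of any particular experiment:

```python
def payoffs(contributions, endowment=20, factor=1.6):
    """Each subject's payoff: kept endowment plus an equal share of the pot."""
    n = len(contributions)
    pot = factor * sum(contributions)
    return [endowment - c + pot / n for c in contributions]

# Full defection vs. full cooperation:
print(payoffs([0, 0, 0, 0]))      # everyone keeps 20
print(payoffs([20, 20, 20, 20]))  # pot = 128, everyone gets 32
print(payoffs([0, 20, 20, 20]))   # lone defector gets 44, cooperators get 24
```

The last line shows the dilemma: full cooperation beats full defection for everyone, yet each individual gains by defecting unilaterally.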

However, experimental results are not in line with the predictions of rational choice theory. Even in one-shot, two-person prisoner's dilemma games, half the participants cooperate. Voluntary contributions to public goods fall as the game repeats, which is why participants have been called “adaptive egoists” (Mueller, 2003) or “conditionally cooperative” (Gächter, 2006).

People seem to contribute because of “warm glow” preferences (utility from the act of contributing) (Palfrey & Prisbrey, 1997), altruistic preferences (wanting to increase others’ utility), or due to error and learning (testing for the best approach).

Gächter found that voluntary cooperation is fragile, that social interaction effects exist in voluntary cooperation, that group composition matters (like-mindedness), and that the management of beliefs matters. Path dependency has also been observed, with the first round being the most important.

In other field experiments, students were informed how many others contributed to social funds (64% versus 46%) and were influenced by the numbers (Frey & Meier, 2004).

The cooperative environment (Ostrom, 1998)

Models of complete rationality do not work well in non-competitive situations. Verbal communication can improve trust and allow groups to reciprocate. There is also some capacity to solve second-order social dilemmas that change the structure of the first-order dilemma (e.g. instituting punishment). Ostrom proposed a model of bounded rationality in which individuals use heuristics, norms and rules to improve the outcomes of non-competitive, not frequently repeated situations.

According to Ostrom, reciprocity, reputation, and trust can overcome the strong temptations of short-run self-interest. A self-reinforcing process can thus be created that increases the level of cooperation and leads to higher net benefits for all, but it is very fragile.

The system is stable if it has a small size, symmetry of assets and resources, a long time horizon, and a low-cost production function. As group size increases, marginal gains from contributing fall and it becomes more difficult to identify and punish defectors. As the stakes increase, cooperation is reduced. Other issues may arise: monitoring becomes more costly, economies of scale may not apply, and marginalisation behaviour within the group may be questioned (is the punishment fair?).

Current Research (Lanz, Wurlod, Panzone, & Swanson, 2017)

A field experiment in supermarkets in the greater London area compared the quantitative impact of three measures to reduce the footprint of consumption (welfare analysis). The experiment tested conditions on four types of goods: soda, milk, spreads, and meat. Three conditions were used:

  1. Information label
  2. Pigouvian tax based on relatively higher footprint of “dirty-type” product alternative
  3. Neutrally framed price change.

The main findings include: the effectiveness of all policy interventions is higher when substitutability is higher, and motivation crowding out due to taxation is relevant only for low-effort products.

Critiques were raised since it was a “one-shot” game testing “warm glow” and it required a state solution.

Cognitive biases

A way of thinking that can lead to systematic deviations from the benchmark of rationality. The concept originates from psychology and behavioural economics. Classic biases include the availability heuristic (remembered events seem more likely), confirmation bias (new information is consumed as complementing preconceptions), the endowment effect (ownership changes value perception), and the framing effect (the presentation affects the conclusion).

Preference Aggregation

The problem with the state without agency and the state as a nexus of cooperation (Acemoglu, 2009) is that the government is treated as a black box that takes individual preferences as input and provides a collective choice as output. Arrow’s Impossibility Theorem (Arrow, 1963) shows that “the only voting method that is not flawed is a dictatorship.” Arrow stipulates that even under reasonable requirements there cannot exist a ranked voting system (i.e. a social welfare function or preference aggregation rule) that transforms the set of preferences into a single global societal order for two or more participants with three or more options. Arrow specifies the reasonable requirements as:

  1. Non-dictatorship
  2. Universality (unique and complete ranking)
  3. Independence of irrelevant alternatives
  4. Pareto efficiency (unanimity)

Pairwise voting over alternatives does not lead to a complete ordering. Only by violating one of the reasonable requirements (dictatorship, Kaldor-Hicks efficiency, non-universality) can preference aggregation be performed. Cox and McCubbins (2000) showed that voting rules shape policy outcomes even if voter preferences are fixed.
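The incompleteness of pairwise majority voting can be demonstrated with a Condorcet cycle. The preference profile below is a standard textbook construction, not data from any specific election:

```python
# Three voters rank options A, B, C such that pairwise majority voting cycles.
profiles = [["A", "B", "C"],   # voter 1: A > B > C
            ["B", "C", "A"],   # voter 2: B > C > A
            ["C", "A", "B"]]   # voter 3: C > A > B

def majority_prefers(x, y):
    """True if a strict majority of voters rank x above y."""
    votes = sum(1 for p in profiles if p.index(x) < p.index(y))
    return votes > len(profiles) / 2

for x, y in [("A", "B"), ("B", "C"), ("C", "A")]:
    print(f"{x} beats {y}: {majority_prefers(x, y)}")
# A beats B, B beats C, yet C beats A: no complete social ordering exists.
```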

Nudges or “Libertarian paternalism”

A nudge can be understood as a change in the choice context (one that would be irrelevant to the homo oeconomicus) to intentionally steer real-world agents’ behaviour in the direction of the homo oeconomicus benchmark (see the organ donor debate). Critiques of such “behavioural welfare economics” are, again, that no global social preference exists and that, without appropriate information, manipulation could occur. It also implies a benevolent paternalist state, which is debatable.

Endogenous versus exogenous institutions

The difference lies in whether one assesses how institutions arose (endogenous) or how institutions impact policy (exogenous). This course focuses on the latter.


Acemoglu, D. (2009). Political economy lecture notes. Retrieved February 27, 2017, from
Arrow, K. J. (1963). Social Choice and Individual Values. New York: Wiley.
Cox, G. W., & McCubbins, M. D. (2000). Political structure and economic policy: The institutional determinants of policy outcomes. In Presidents, Parliaments and Policy (pp. 21–96). Cambridge University Press.
Frey, B. S., & Meier, S. (2004). Social comparisons and pro-social behavior: Testing “conditional cooperation” in a field experiment. The American Economic Review, 94(5), 1717–1722.
Gächter, S. (2006). Conditional cooperation: Behavioral regularities from the lab and the field and their policy implications. CeDEX Discussion Paper, 3.
Lanz, B., Wurlod, J.-D., Panzone, L., & Swanson, T. (2017). The behavioural effect of Pigovian regulation: Evidence from a field experiment. IRENE Working Paper.
Mueller, D. C. (2003). Public Choice. Springer.
Ostrom, E. (1998). A behavioral approach to the rational choice theory of collective action: Presidential address, American Political Science Association. American Political Science Review, 1, 1–22.
Palfrey, T. R., & Prisbrey, J. E. (1997). Anomalous behavior in public goods experiments: How much and why? The American Economic Review, 829–846.


ASC: Introduction

Argumentation and Science Communication will discuss how scientific arguments are made and how they are eventually communicated.

The first week’s readings are listed in the references (Bradley & Steele, 2015; Lempert, Nakicenovic, Sarewitz, & Schlesinger, 2004). A particular focus will be on Mueller (2010), for which the following questions should be answered:

  1. What is a computer model?
  2. What are the basic reasons for limits/errors/uncertainties of (climate) models or model predictions? Which ones, do you think, are most important, and for which situations?
  3. What kind of results are generated by models?
  4. What are reasons for and against the claim that computer models inform us about the world?

Scientific knowledge makes sense in its context, but communicating it across fields or indeed beyond science makes it necessary to pick an appropriate language. There is another issue: science is asked to be informative rather than prescriptive; however, it is usually presented and perceived in a prescriptive manner.


Argumentation is needed in policy analysis because decisions are made under deep uncertainty. Argumentation analysis is a philosophical method to address the uncertainty.

Predict-then-act (Lempert, Nakicenovic, Sarewitz, & Schlesinger, 2004)

A rational decision ranks policy alternatives on the basis of their expected utility, contingent on the probabilities of alternative future states of the world. Under Rational Choice Theory the outcome with the highest utility would be picked. Rational choice would therefore require no democratic deliberation, if science is done properly. However, each step of the scientific endeavour requires deliberation. Science itself is agenda-setting and therefore cannot be performed unquestioned. The rational choice assumptions are not fulfilled.
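The ranking step of predict-then-act can be sketched as follows. The states, probabilities, and utility numbers are purely illustrative assumptions, not values from Lempert et al.:

```python
# Probability-weighted future states of the world (hypothetical).
states = {"low_warming": 0.3, "mid_warming": 0.5, "high_warming": 0.2}

# utility[policy][state]: illustrative numbers only.
utility = {
    "business_as_usual": {"low_warming": 10, "mid_warming": 2, "high_warming": -20},
    "mitigation":        {"low_warming": 6,  "mid_warming": 5, "high_warming": 3},
}

def expected_utility(policy):
    """Expected utility = sum over states of p(state) * u(policy, state)."""
    return sum(p * utility[policy][s] for s, p in states.items())

best = max(utility, key=expected_utility)
print(best, expected_utility(best))  # mitigation 4.9
```

Deep uncertainty is precisely the situation where the ingredients of this computation (the model, the probabilities, the utilities) are unknown or contested.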

[Deep uncertainty exists when] decision-makers do not know or cannot agree on: (i) the system models, (ii) the prior probability distributions for inputs to the system model(s) and their inter-dependencies, and/or (iii) the value system(s) used to rank alternatives. (Lempert et al., 2004)

Such a reductive approach is an idealisation that abstracts away many aspects. On the upside, it is a smart formal approach that can represent diverse measures within one holistic measure (expected utility). On the downside, it may not matter for a decision (different statistical values of life for different countries) or it may not apply (requirements may not be fulfilled due to a lack of knowledge beyond information about outcomes).

Summa summarum, rational choice should be seen as a special case, rather than a general paradigm to conceive a policy decision problem. Argumentation is necessary to delineate the problem and frame the options, characterise uncertainties of outcomes, characterise value uncertainties, evaluate uncertainties, and deliberate a fair decision from plural perspectives (Hansson & Hirsch Hadorn, 2016).


Philosophical methods address two questions in a systematic way:

  1. What do you mean? Answer requires an analysis of respective concepts.
  2. How do you know? Answer requires an analysis of respective arguments.

Philosophy does not generate new knowledge, but refers to what is known. It makes relevant distinctions and describes consequences for normative claims about knowledge (epistemology) and action (ethics). It points at the limits of what we can believe and can therefore be considered a method for critical thinking.

Critical Thinking

The goal is to argue, communicate and act responsibly. It requires the ability to accept criticism as well as a willingness to reach an understanding. Based on expertise in a scientific field or regarding real-world problems, critical thinking tries to apply cognitive skills to answer the two basic questions of philosophy mentioned above.

Short Guide to Analysing Texts (Brun & Hirsch Hadorn, 2014)

A guide that structures analysing texts in 5 steps:

  1. Rules for Working and Principles of Understanding
    • Refer explicitly to the analysed text.
    • Write down results and your main reasoning.
    • Develop text analyses and results which are understandable to others.
    • Give reasons for your text analyses/results.
    • Read texts several times to test and revise understanding of both the parts and the whole.
    • Start with the assumption that the author makes true statements and gives sound arguments and go on to test this assumption (principle of charity).
  2. Preparing the Text Analysis: How to Proceed
    • Gather basic information about the text: Authors (ghostwriting?), publishing (scientific or popular?), topic, context of writing and publishing, target readership, type/function of text, and impact
  3. Reading: How to Work on the Text
  4. Structuring: How to Analyse the Structure of the Text
    • Divide the text into smaller passages (e.g. paragraphs) and number them consecutively
    • For every passage of text, consider the following questions:
      1. Content: What is this passage about?
      2. Function: Which function does this passage serve?
      3. Summary: How would you title the passage (content- and function-wise)?
  5. Summarising: How to Capture the Essential
    • Represent concisely essential statements, central arguments and the basic structure of the text.
    • Summaries have to be comprehensible without acquaintance with the original text.
    • Adapt the representation to the aim of your text analysis and to the question it should answer (a short text is not always the optimal solution).


Bradley, R., & Steele, K. (2015). Making climate decisions. Philosophy Compass, 10(11), 799–810.
Brun, G., & Hirsch Hadorn, G. (2014). Textanalyse in den Wissenschaften: Inhalte und Argumente analysieren und verstehen. vdf Hochschulverlag AG.
Hansson, S. O., & Hirsch Hadorn, G. (2016). The Argumentative Turn in Policy Analysis. Springer International Publishing.
Lempert, R., Nakicenovic, N., Sarewitz, D., & Schlesinger, M. (2004). Characterizing climate-change uncertainties for decision-makers. An editorial essay. Climatic Change, 65(1), 1–9.
Mueller, P. (2010). Constructing climate knowledge with computer models. Wiley Interdisciplinary Reviews: Climate Change, 1(4), 565–580.


Urban Design II: Los Angeles

Today’s topic is the urban design of Los Angeles. The main lenses will be top-down infrastructure (ecology/landscape), the fragmented sub-urban (suburbia) and places for experimentation (micro/temporary programs).

Los Angeles is a car city. It is the antagonist to New York, the incarnation of the battle between the East and West Coast. Hollywood is located in Los Angeles and Hollywood produces the modern understanding of America.

Los Angeles is also a (horizontal) grid city. The city grew out of property speculation with Asian migrants building the railways in hope of a better future. Architects came to Los Angeles to build a small bungalow or two. Los Angeles is multiple cities in one place (not geographically, but imaginary).

Los Angeles is also a postmodern city. The modern idea ended with the Second World War. Postmodernity interweaves the past with the present and expectations of the future, in contrast to modern concepts of Communism, scientific endeavour and Socialism.

Ed Soja describes Los Angeles as highly fragmented, inducing a feeling of being lost and dislocated, epitomised in the Bonaventure Hotel. Conventional understandings of a city are questioned by Los Angeles; conventional standards of planning do not apply. Individuals, pressure groups and planning authorities vie over how to develop the city. Non-planning has become a characteristic of the city. Frank Gehry is a product of Los Angeles. Angelenos spend one month per year in their cars.

Los Angeles is also plagued by skid rows (homeless housing in tents) and riots, both forming the character of the city. The Mexican past of the city mixes with the American identity and incoming Asian cultures (Korean, Chinese and Japanese).

The city of Los Angeles has a density of 3,000 inhabitants per square kilometre with 4 million inhabitants, whereas the county of Los Angeles drops to 900 inhabitants per square kilometre at 10 million inhabitants. Greater Los Angeles incorporates neighbouring counties, and the number of inhabitants rises to 18 million.

Top-down Infrastructure

Los Angeles laid down the grid and people were invited to come. Real estate taxes drove the engine of Los Angeles. 160 administrative subdivisions are contained within the 5 counties Greater Los Angeles covers.

Los Angeles was unexplored and presented itself as a desirable opportunity. Route 66 was the road from Chicago to Los Angeles that people travelled in pursuit of happiness. Los Angeles was sold as an antidote to the urban city by promoting the suburban. Trams connected the suburbs to the centre, but were soon complemented and then overtaken by highways. Within 10 years in the 1920s, Los Angeles grew with the incoming migrants from empty fields to a full-blown city. As a side note, the same is happening in most major cities in developing countries, from China and India to Nigeria. Los Angeles also planned quarters that enforced segregation.

Los Angeles had no notable economy before 1920, but soon produced 25% of US oil and soon expanded its importance via the film industry.

The floodwater infrastructure cuts through the city and carries valuable water to the ocean. The hostile environment (swamp, desert, mountains) made occupying the area difficult; vast, strategic infrastructure is necessary for the city to work. Water is imported through massive aqueducts from the Colorado River. Electricity is imported via the longest high-voltage line, from the Pacific Northwest. The hot summers make people use air conditioning on an incredible scale. The interstate highway connection to New York binds the city to the rest of the US. The Los Angeles River has been converted from a meandering river into a straight channel. Most of the time it is an empty 8-metre gap in the city, but during strong downpours it may completely fill with floodwater. Graffiti artists reclaim the floodwater system by filling it with art; people reclaim it with sports and recreation. (Reclaiming) infrastructure will become a core task of architecture.

Fragmented Sub-urban

Suburbia was an attempt to decentralise cities in the face of the nuclear threat during the Cold War. Los Angeles has no strong centralised core, but spans equally in all directions. Fragmentation ensues as all suburban areas create their own nuclei.

Pacific Electric Railway connected the disparate suburbs and enabled movement, followed by the highway. Industry gathered along the river and fragmented the city even more: industrial suburbs.

Hollywood is a suburb. The suburb was projected to the rest of the US with the help of the film industry. The suburb was also a dream where each hard-working American can have a house.

The suburban and exurban have become urban as the density of suburbia and communication technologies remove any distance. Modern cities are hybrid, both urban and non-urban.

Places for experimentation

New architecture was tested in Los Angeles. Many architects went there to build bungalows and to test new approaches to building. Los Angeles is the playground of architecture. The Lovell Beach House was built in the 1920s and looks like contemporary 2010s architecture.

Lovell Beach House


The architecture of single buildings is the first step towards urban design; houses are the atomic elements of urbanism. Frank Lloyd Wright became the greatest American architect and inspired many architects in Los Angeles, giving the city its urban characteristic. Architects pick up elements of the past, mix them with their own interpretation, and create buildings for the future in the present. Los Angeles was the core of postmodern architecture in the US. Experimental architecture, put forward by Gehry in works such as the Disney Concert Hall, has become a core concept for creating new architecture.

Disney Concert Hall





IAP: Introduction

The internet is a global-scale, technically complex artefact of immense international social and political importance. It is formed by the interaction of technical constraints (e.g. speed of light, number of addresses), usage models and behaviour, technological design choices and policy decisions.

This course will focus on the Internet; other networks (mobile networks, local networks, etc.) will only be mentioned marginally, even if they are converging. Applications of the Internet such as the Web, social networks and other online services are not covered.

Networking History

One of the earliest networks, a mechanical semaphore, was developed by Claude Chappe (1763-1805). In 1837 the electrical telegraph allowed transmission of Morse code. In 1866 there was a connection between London and New York at a price of $100 for 20 words. In 1861 Reis developed telephony. In 1895 the first wireless communication was demonstrated; in 1906 radio was broadly introduced. Television was broadcast in 1928.

The Internet is based on packet switching, first described by Kleinrock in 1961 (Kleinrock, 1961). In 1964 Baran designed military nets using packet switching (Baran, 1964). In 1967 the ARPAnet was conceived; it was installed in 1969. In 1972 the ARPAnet had 15 nodes. In 1973 Metcalfe proposed Ethernet (Metcalfe & Boggs, 1976). Vinton G. Cerf and Robert E. Kahn’s internetworking principles were developed in 1974 (Cerf & Kahn, 1974). By 1979 the ARPAnet had 200 nodes.

The Internet was commercialised in the 1990s and the ARPAnet was decommissioned. The NSFnet allowed commercial use from 1991, was itself shut down in 1995, and gave way to the commercial Internet carrying the World Wide Web (WWW). In the 2000s the dot-com bubble showed for the first time the Internet’s potential impact on the real world.

Internet Basics

The Internet carries packets. Packets have headers that describe them and a payload that contains their contents. Officially, Internet routers only care about packets. The explicit analogy is mailing a letter: the letter inside the envelope is the payload and the address on the envelope is the header. This differs from telephone traffic, where the traffic is analysed to optimise the network (fax versus voice call).

IP addresses have 32 bits and can therefore connect approximately 4 billion devices. An IP address has become a scarce resource. The questions arise: who allocates addresses, who can be reached globally, and should a new protocol be adopted? IP version 6 has been proposed as the solution.
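The address-space arithmetic is easy to check directly:

```python
# 32-bit IPv4 address space vs. 128-bit IPv6 address space.
print(2**32)   # 4294967296 addresses, roughly 4.3 billion
print(2**128)  # IPv6: about 3.4e38 addresses

# A single IPv6 /64 subnet holds 2**64 interface identifiers, which is
# why a 48-bit MAC address fits comfortably inside one subnet.
print(2**64 > 2**48)  # True
```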


I think the IETF hit the right balance with the 128 bits thing. We can fit MAC addresses in a /64 subnet, and the nanobots will only be able to devour half the planet.


A protocol defines a set of messages that are sent between end-points, what these messages mean, and what end-points should do with them. The Internet protocol stack consists of 5 layers: physical, link, network, transport and application. Throughout this course we will focus on transport and network.

The data sent in a message gets an additional header for each layer it traverses. The Internet has IP at its core and does not change it (“narrow waist model”). The layers above (transport and application) or below (physical, link) can be arbitrarily changed. Side note: in Germany, carrier pigeons were successfully used to send a message. The rigidity of IP is claimed to be the reason for the success of the Internet. In reality, there are many more layers: real-world packets with 12 and more layers have been observed, where the IP layer is repeated multiple times. HTTP has become the main protocol, and other protocols are often blocked; consequently, much traffic that is not actually text (e.g. video) is sent over it.
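The per-layer header idea can be sketched as a toy encapsulation function. The layer names are illustrative; real headers are binary wire formats, not strings:

```python
def encapsulate(payload, layers=("HTTP", "TCP", "IP", "Ethernet")):
    """Each layer prepends its own header to the payload it receives."""
    packet = payload
    for layer in layers:
        packet = f"[{layer} hdr]{packet}"
    return packet

print(encapsulate("hello"))
# [Ethernet hdr][IP hdr][TCP hdr][HTTP hdr]hello
```

Reading the result from left to right mirrors what a receiver does: each layer strips its own header and hands the rest upward.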

The Internet consists of many autonomous systems (run by Internet Service Providers (ISPs)) that communicate via the Border Gateway Protocol (BGP). Each system advertises where it can deliver messages to; however, they need not be truthful. Incidents include advertising optimal routes to everywhere in order to attract all traffic (including traffic from special regions). Another variant is to advertise a cheap route but never deliver the packet. It is not clear how to resolve such misuse of the system.

The Internet was designed insulated from commercial and political pressures, but reality has changed. The idea of the Internet and its real-world use have diverged. The course focuses on the tension between technology, policy, commerce and politics.


Baran, P. (1964). On distributed communications networks. IEEE Transactions on Communications Systems, 12(1), 1–9.
Cerf, V., & Kahn, R. (1974). A protocol for packet network intercommunication. IEEE Transactions on Communications, 22, 627–641.
Kleinrock, L. (1961). Information flow in large communication nets. RLE Quarterly Progress Report, 1.
Metcalfe, R. M., & Boggs, D. R. (1976). Ethernet: Distributed packet switching for local computer networks. Communications of the ACM, 19(7), 395–404.


CSD: Introduction

The course “Cognition in Studio Design – analytic tools for evidence-based design” will discuss readings on space syntax (Bafna, 2003), navigation issues (Carlson, Hölscher, Shipley, & Dalton, 2010), as well as functions and applications of spatial cognition (Montello & Raubal, 2013).

To compute space syntax DepthmapX will be used.


Bafna, S. (2003). Space syntax: A brief introduction to its logic and analytical techniques. Environment and Behavior, 35(1), 17–59.
Carlson, L. A., Hölscher, C., Shipley, T. F., & Dalton, R. C. (2010). Getting lost in buildings. Current Directions in Psychological Science, 19(5), 284–289.
Montello, D. R., & Raubal, M. (2013). Functions and applications of spatial cognition. In Handbook of Spatial Cognition (pp. 249–264). American Psychological Association (APA).


SMADSC: Introduction

Complex systems are the core topic of Social Modelling, Agent-Based Simulation, and Complexity. Complex systems usually emerge as an artefact of interaction. The output of a complex system often follows a power law and may exhibit regime or phase changes, known as tipping points. Emergent properties and scale-free organisation are typical features of complex systems. Such a system could be analysed top-down, but is best studied bottom-up.

In general, a social system is analysed by creating a mental model of it, deriving hypotheses regarding the endogenous and exogenous forces that drive it, and finally instantiating an agent-based model (ABM) in code that is simulated in silico.

Recommended reading for the week is Chapter 9 in Complex Adaptive Systems: An Introduction to Computational Models of Social Life (Miller & Page, 2009) and Chapter 8 in Introduction to Computational Social Science: Principles and Applications (Cioffi-Revilla, 2013).

Agent-based Models (ABM)

An ABM is usually an object-oriented software system that instantiates a model of living systems of social entities. Agent-based models go beyond numerical analysis; rather, they observe emergent behaviour. Broad paradigms that influence ABMs are cellular automata, big data, social networks, and generative models. Key concepts are emergence and bottom-up computation, where micro-level rules lead to macro-level behaviours. There are two dominant characterisations of ABMs:

  1. A positive representation attempts to closely recreate or capture the abstract or detailed essence of a prototype system.
  2. A normative representation provides input control for exogenous steering of internal feedback loops.

Generative ABMs are useful in three general cases:

  1. Modelling historical systems, that cannot be revisited
  2. Long-lived systems, that span a longer time than can be observed
  3. Unethical, illegal, unsafe or unlikely environmental settings or exogenous stimuli to the system

The Game of Life (Conway, 1970)

A game with two cell states {dead, alive} and the following rules:

Each cell checks its own Life State and those of the cells in its local neighbourhood at a Moore distance of 1. If alive, display a pixel; if dead, do not. If the cell has fewer than two or more than three live neighbours, set it dead. If it has exactly three live neighbours, set its Life State alive. Randomised activation of cells continues “forever.”

It uses the concepts of cellular automata and either Moore or von Neumann distance as well as distance-neighbourhoods.
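The rules above can be sketched in a few lines. This minimal version uses synchronous updates on a toroidal grid (the classic formulation; the randomised activation mentioned above is an asynchronous variant):

```python
def step(grid):
    """One synchronous Game of Life step on a wrapping grid (1=alive, 0=dead)."""
    n, m = len(grid), len(grid[0])
    new = [[0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            # Count live cells in the Moore neighbourhood at distance 1.
            alive = sum(grid[(i + di) % n][(j + dj) % m]
                        for di in (-1, 0, 1) for dj in (-1, 0, 1)
                        if (di, dj) != (0, 0))
            # Survive with 2 or 3 live neighbours; birth with exactly 3.
            new[i][j] = 1 if alive == 3 or (grid[i][j] and alive == 2) else 0
    return new

# A "blinker" oscillates between vertical and horizontal with period 2:
blinker = [[0, 0, 0, 0, 0],
           [0, 0, 1, 0, 0],
           [0, 0, 1, 0, 0],
           [0, 0, 1, 0, 0],
           [0, 0, 0, 0, 0]]
print(step(blinker))
```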

Other famous ABMs are Flocking (Reynolds, 1987), Swarming (Bonabeau & Meyer, 2001), Residential Segregation (Schelling, 1969), and Residential Segregation using vector-based GIS (Crooks, 2010).


Bonabeau, E., & Meyer, C. (2001). Swarm intelligence. Harvard Business Review, 79(5), 106–114.
Cioffi-Revilla, C. (2013). Introduction to computational social science: principles and applications. Springer Science & Business Media.
Conway, J. (1970). The game of life. Scientific American, 223(4), 4.
Crooks, A. T. (2010). Constructing and implementing an agent-based model of residential segregation through vector GIS. International Journal of Geographical Information Science, 24(5), 661–675.
Miller, J. H., & Page, S. E. (2009). Complex adaptive systems: An introduction to computational models of social life. Princeton University Press.
Reynolds, C. W. (1987). Flocks, herds and schools: A distributed behavioral model. ACM SIGGRAPH Computer Graphics, 21(4), 25–34.
Schelling, T. C. (1969). Models of segregation. The American Economic Review, 59(2), 488–493.


ISN: What are Social Networks?

Social networks are based on relations between two or a few individuals, ranging from friendships to contracts to work contacts.

Throughout the course, the theory behind social networks will be put into context with methods of comparing and applying social networks. Examples from different scientific disciplines will be used to illustrate the social networks.

Network descriptives

Mathematical descriptions of networks are useful descriptives. An adjacency matrix can be used to represent a graph as nodes and edges.
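A small example makes the adjacency-matrix representation concrete. The friendship data here is hypothetical:

```python
# Undirected friendship network: A[i][j] = 1 if node i and node j are connected.
nodes = ["Ann", "Bob", "Cem", "Dia"]
A = [[0, 1, 1, 0],
     [1, 0, 1, 0],
     [1, 1, 0, 1],
     [0, 0, 1, 0]]

# For an undirected graph, the degree of a node is its row sum.
degrees = {nodes[i]: sum(row) for i, row in enumerate(A)}
print(degrees)  # {'Ann': 2, 'Bob': 2, 'Cem': 3, 'Dia': 1}
```

Note the symmetry A[i][j] == A[j][i], which holds only for undirected networks.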

Networks can be analysed on different levels:

  • Dyad level (O(n^2)): connections between nodes
  • Node level (O(n)): properties of nodes
  • Network level (O(1)): clustering of nodes.

Centrality can mean access to resources, connecting different parts of the network, or being part of many interactions. For a detailed report on centrality measures, look at this post in my Complexity and Global Systems Sciences lecture notes. Different centrality measures often disagree, and in larger networks they will rank nodes differently. The choice of centrality measure depends on the research question.


Generally, for any network one should start with the following descriptives before continuing to more advanced analysis.

  1. Start with a visualisation of a network.
  2. Compute the density of the network (number of edges divided by the maximal number of edges; note that the maximal number differs between directed (e_{max} = n(n-1)) and undirected (e_{max} = n(n-1)/2) graphs).
  3. Measure centrality in social networks.
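The density formulas from step 2 translate directly into code:

```python
def density(n, e, directed=False):
    """Density = e / e_max, with e_max = n(n-1) directed, n(n-1)/2 undirected."""
    e_max = n * (n - 1) if directed else n * (n - 1) / 2
    return e / e_max

print(density(4, 4))                 # undirected: 4/6 ≈ 0.667
print(density(4, 4, directed=True))  # directed:   4/12 ≈ 0.333
```

The same edge count yields half the density in the directed case, because twice as many edges are possible.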


PE: Institutions and Economic principles

The main reference for today is Mueller’s Public Choice III, Chapters 1 and 2 (Mueller, 2003), as well as Acemoglu’s Political Economy Lecture Notes, Chapter 1 (Acemoglu, 2009). Additional readings are Acemoglu’s Chapter 2 and work by Ostrom (Ostrom, 1998) and Schnellenbach (Schnellenbach & Schubert, 2015).

Political Economy joins the fields of Political Science and Economics. To illustrate, consider the Trump administration: politics affects the economy, and the choice of policy areas benefits some sectors over others. So far, the stock markets have reacted positively to the Trump administration in expectation of reduced “red tape”. But upon closer inspection, specific sectors benefit whereas others linger or decline. Political decisions will influence which sectors flourish.


As always, the term is defined in widely different ways. We will rely on the definition of “rules of the game” provided by Acemoglu (p. 5ff). This includes political institutions (constitution, electoral rules, separation of powers, checks and balances, etc.) and economic institutions (property rights, commercial law, contract law, etc.). Acemoglu also distinguishes between formal (de jure) and informal (de facto) institutions.

Four different views on institutions compete according to Acemoglu (in no particular order):

  1. Efficient institutions view: institutions maximise total surplus or compensation (Coase theorem), should have no impact on output (not proven empirically) and are troubled by commitment problems (imperfect contracts about compensation).
  2. Social Conflict view: Institutions are chooses by political power (rent maximisation).
  3. Ideology/belief view: Different view on what is best for society.
  4. Incidental institutions view: By-product of other social interactions (taxation implies representation and ultimately leads to parliamentary representation).

These views on their own are often not fully explanatory. Inefficient institutions can be explained by hold-up problems (commitments made by current elites do not bind future elites), political losers (parliaments usually cannot shrink) or economic losers (reforms benefit some more than others, and the losers may hold political power). Three consequences follow from these limitations: 1) constraints on political power and a broad distribution of political power make secure property rights more likely; 2) stable economic institutions are more likely if rents are limited; and 3) institutional reforms are more likely to be successful if they do not threaten incumbents.

Economic Principles

Homo Oeconomicus assumes rationality and utility maximisation, but is undermined by findings in Behavioural Economics (Schnellenbach & Schubert, 2015).

Trade and markets are usually a good way to organise economic activity.

Perfect competition/markets assume well-defined property rights, a large number of buyers and sellers, perfect information, homogeneous products, no barriers to entry and exit, price-taking participants (no market power), homo-oeconomicus participants (with profit-maximising firms), no externalities, no transaction costs and constant returns to scale. Under these assumptions the market yields a Pareto-optimal allocation.

Pareto-efficiency/optimality implies that no one can be made better off without making someone else worse off.

Kaldor-Hicks optimality is a relaxation of Pareto-optimality: a change counts as an improvement if the benefiting side could hypothetically compensate the worse-off side and still be better off.
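As a stylised illustration (the utility numbers are assumed, not from the lecture), a reform can be a Kaldor-Hicks improvement without being a Pareto improvement:

```python
def pareto_improvement(before, after):
    """No one worse off, at least one person strictly better off."""
    return (all(a >= b for a, b in zip(after, before))
            and any(a > b for a, b in zip(after, before)))

def kaldor_hicks_improvement(before, after):
    """Total surplus rises, so winners could hypothetically compensate losers."""
    return sum(after) > sum(before)

before = (10, 10)
reform = (18, 7)   # agent 1 gains 8, agent 2 loses 3
print(pareto_improvement(before, reform))        # False: agent 2 is worse off
print(kaldor_hicks_improvement(before, reform))  # True: 25 > 20
```

Agent 1 could pay agent 2 anything between 3 and 8 units and both would end up better off than before, which is exactly the hypothetical compensation the Kaldor-Hicks criterion invokes.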

Market failures occur with public goods, externalities, monopolies, unequal distribution of resources, etc.

Types of goods:

                 Excludable                Non-excludable
  Rivalrous      Private goods             Common-pool resources
                 (clothing, car)           (fish stock, timber, coal)
  Non-rivalrous  Club goods                Public goods
                 (cinemas, satellite TV)   (national defense)

The four types of economic goods.
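The 2x2 typology above can be written as a small lookup from the two properties (rivalry and excludability) to the type of good:

```python
# (rivalrous?, excludable?) -> type of good, following the table above.
GOOD_TYPES = {
    (True, True): "private good",           # clothing, car
    (True, False): "common-pool resource",  # fish stock, timber, coal
    (False, True): "club good",             # cinemas, satellite TV
    (False, False): "public good",          # national defense
}

def classify(rivalrous, excludable):
    return GOOD_TYPES[(rivalrous, excludable)]

print(classify(rivalrous=False, excludable=False))  # public good
print(classify(rivalrous=True, excludable=False))   # common-pool resource
```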

The free-rider problem arises with public goods: an individual does not have to contribute in order to receive the public good, and consequently the public good is underprovided.

Externalities are unintended impacts on another individual or firm: the social cost is not reflected in the private cost. Pareto-optimality can be restored by a Pigouvian tax or by Coase bargaining.

The Coase Theorem states that if an externality is tradable and transaction costs are sufficiently low, bargaining between the involved parties will lead to a Pareto-optimal solution regardless of the initial allocation of property. However, difficulties in assigning property rights and large numbers of individuals undermine the Coase Theorem.

The Tragedy of the Commons is that individual, rational self-interest runs contrary to the common good.

The n-person social dilemma (Ostrom, 1998) stipulates that the more persons cooperate, the higher the benefit; but for a given number of cooperating players, a single defecting player benefits more than a cooperating one. This induces non-cooperation, which lowers the overall achievable benefit. To overcome non-cooperation, social conventions can be applied, ranging from adaptive learning and reputation to social sanctions (in contrast to the homo oeconomicus).
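The tension can be made concrete with a stylised payoff function (the numbers n = 10, b = 5 and c = 3 are assumed for illustration, not Ostrom's exact model): each cooperator pays a cost c, and each act of cooperation adds a benefit b shared equally by all n players.

```python
def payoff(cooperates, n_cooperators, n=10, b=5.0, c=3.0):
    """Payoff to one player given the number of cooperators in the group."""
    share = b * n_cooperators / n     # everyone receives the shared benefit
    return share - c if cooperates else share  # only cooperators pay the cost

# For a given number of cooperators, a defector beats a cooperator ...
print(payoff(True, 6))    # 0.0  (5*6/10 - 3)
print(payoff(False, 6))   # 3.0
# ... yet full cooperation beats full defection for everyone.
print(payoff(True, 10))   # 2.0
print(payoff(False, 0))   # 0.0
```

Defection dominates at every level of cooperation, yet universal defection leaves all players worse off than universal cooperation, which is exactly the dilemma.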

In a Game of Chicken the best joint outcome is mutual cooperation, but each individual does better by unilaterally defecting; if both defect, the resulting state is the worst for all.

(G,D)                          Contributes to fence-building   Does not contribute
Contributes to fence-building  (3,3)                           (2,3.5)
Does not contribute            (3.5,2)                         (1,1)

The game-theoretic description of the Chicken game.

Conceptions of the state

Often states are normatively justified, by arguments ranging from survival (better than the state of natural anarchy) to efficiency (e.g. distribution of public goods) and equity (e.g. social fairness).

According to Acemoglu (Acemoglu, 2009) the state is often conceptualised as:

  1. State without agency: has no interests of its own, rectifies market failures
  2. State as nexus of cooperation: Hobbesian/Rousseauian view of the state (as compared to anarchy)
  3. State as agent of a social group: capitalists, the financial sector, an ethnic group, men, etc.
  4. State as grabbing hand: members of the state look after their own interests
  5. State as autonomous bureaucracy: represents interests beyond those of its members


Acemoglu, D. (2009). Political Economy Lecture Notes. Retrieved from
Mueller, D. C. (2003). Public Choice III (3rd ed.). Cambridge, UK: Cambridge University Press.
Ostrom, E. (1998). A Behavioral Approach to the Rational Choice Theory of Collective Action. American Political Science Review, 92(1), 1–22.
Schnellenbach, J., & Schubert, C. (2015). Behavioral Political Economy: A Survey. European Journal of Political Economy, 40, 395–417.


Urban Design I: Tools

Throughout the course Urban Design I several “tools” were introduced that impact urbanity.


Tools of this kind belong to top-down approaches and usually give form to the urbanscape in a radical way.

Megascale planning (Berlin)

Berlin was an early example of a politically motivated re-organisation of administrative units. Berlin grew from nearly 2 million to 4 million people due to the administrative rearrangement. Infrastructure was created to join the adjacent cities and towns.

Horizontal-vertical grid (New York)

To tackle shanty towns and issues with hygiene, New York proposed the grid layout of the city in 1811. The grid was superimposed over the old city, and only main roads like Broadway give a glimpse of previous layouts. The grid structure was complemented in the early 20th century by vertical zoning laws that created the concept of high-rises with private plazas and mandatory residential areas in the buildings. To compensate for the high density, Central Park was created as a contrasting void.


This set of tools focuses on cities with destroyed urban fabric and potential ways of reconstructing the fabric.

Critical Reconstruction (Berlin)

The “Planwerk Innenstadt Berlin” was a combined effort to fill the holes in the city left by the division and the war. The main idea was to re-discover the historic character of the city and modernise it.

De-urbanisation (Sarajevo)

The urban fabric of the city was not only damaged by the ongoing war within the city, but also intentionally destroyed to remove signs of urban co-existence of different ethnic groups. This process has been called urbicide and is intrinsically connected to ethnic cleansing. The use of urban space shifted and attained new meaning. Open spaces became dangerous due to the constant sniper fire, and new spaces had to be acquired. The cold winters forced people to cut down all trees for firewood. The city’s transformation was consequently two-fold: enforced by destruction and by new uses of the remainders.

Shrinking City (Detroit)

In a shrinking city the core loses its role and the periphery becomes dominant, often accompanied by generating suburbia. The city’s decreasing role reduces the services it provides and requires a drastic rearrangement of budgeting. Shrinkage is an often-ignored reality that is only addressed when all potential alternatives have failed; see the bankruptcy of Detroit.

Micro/Temporary programmes

This set of tools is limited in either time or space. It focuses on action-driven approaches where either events in the near past triggered the programme or the programme is an answer to an issue of missing urban functionality.

Temporary Urbanism (Berlin)

The empty/negative spaces of Berlin offer room for temporary and spontaneous use; temporary urbanism arises as a consequence. An example is the “Kitchen Monument”, a mobile kitchen that is temporarily installed in empty spaces throughout Berlin.

Turbo Urbanism (Sarajevo)

The negative spaces created by the war and the urbicide left many urban functions unfulfilled. New architectural and urban interventions materialised and transformed economic identities, accompanied by gentrification.

User-generated Urbanism (Athens)

Small scale, user-generated, architectural solutions to urban problems such as self-managed parks, occupation/squatting movements and alternative economy networks. They include new programs for meetings and open assemblies, and new models of production, such as the formation of urban plantations.

Cooperation and Dialogue (Cape Town)

Following apartheid segregation and the redistributive attempts of the 1990s, a new paradigm was introduced in the early 2000s. To contain sprawl, amenity requirements were introduced (access to public infrastructure and to social and economic facilities). Instead of attempting total redistribution, intermediate solutions were added to the policy portfolio, such as upgrading the infrastructure of informal settlements rather than rebuilding them completely. Local needs were examined and localised solutions actively sought.

Street Renaissance (New York)

The lack of funding in the Department of Transportation forced New York to become creative in adapting to new urban realities. Street paint was used to reduce space for cars, broaden sidewalks and introduce bicycle lanes. The reclaimed space was then occupied by pedestrians, restaurants and street furniture. This urban re-engineering enabled a fluid transformation of New York.

Microplanning (São Paulo)

Microplanning transforms unused micro-spaces into highly functional pieces of a city. For instance, the Garrido Boxing gym is situated in the unused space below an elevated highway. It exploits the urban morphology and can be considered a micro-intervention that enables local residents to participate in sports. Benches, skate parks, mini lawns and planters can all be considered micro-interventions that improve unused public space.

Active Infill (Detroit)

Active infill is a reaction to empty, decaying space in a city. It can be tackled either top-down with infrastructure restructuring or bottom-up through community-driven projects. Infrastructure restructuring includes condensing services (thereby dis-servicing certain areas and effectively shrinking the city) and offering incentives in “condensing areas”. Community projects make use of the empty space and give new meaning to the local urban fabric.

Informal/Hybrid City

These tools focus on actors at the border of the formal and the informal, highlighting how both interact and even how they can be combined.

Reactivating the city (Sarajevo)

Political paralysis has caused neglect and destruction due to disagreement over how to proceed, opening up the city as a new urban frontier. The Historical Museum of Bosnia and Herzegovina epitomises the phenomenon: to counter decay caused by budget cuts, supporters of the museum suggested stretching transparent vinyl over scaffolding to stop water from causing further damage. The museum is tasked with cultural preservation and offers a venue for society to deal with its traumatic past.

Hybrid City (Caracas)

An interplay of formal and informal settlements characterises housing settlements such as “23 de Enero”. The formal housing structures have been “improved” with informal settlements around them to optimise the use of space and to accommodate social and economic functions (such as shops and restaurants) required by the inhabitants.

Repurposing infrastructure (New York)

Rail viaducts were a common necessity in the early 20th century. In the 1950s trucks displaced trains and robbed the elevated tracks of their purpose. The High Line showcases the repurposing of infrastructure. The viaduct became a linear park that offers a green space through which people can move around the city.

Public Infrastructure/Mobility

These tools demonstrate the interconnectedness between mobility and urbanity and highlight the interaction (both negative and positive).

Oil and Automobile City (Caracas)

A car-centric approach to public infrastructure focuses on freeways and elevated highways that partition the city and thereby segregate it.

Multiple Hubs (Caracas)

The inaccessibility of slums such as San Augustin, up in the hills, requires new approaches to urban mobility. Cable cars were introduced and transformed the urban landscape. Not only did they provide transport to the residents, they also offered functional space for formal services (postal, banking, government) in an otherwise informal environment. This overlay of functionalities popularised the approach throughout the slums of Latin America.

Urban Mobility (São Paulo)

The vector of mobility defines the kind of urban space that will be created. São Paulo showcases a steady move away from public transport towards individual transport, reducing public space and slowing traffic flow.


These tools take influence on an abstract but fundamental level. Often they set the rules of the game and indirectly enforce specific outcomes.

Developer as Architect (Athens)

The deregulation of the construction industry, combined with regulation of building structure, effectively removed the architect from the equation. Polykatoikia were constructed throughout Athens without a masterplan. An abstract legislative framework enforced the practice of self-building.

Masterplanning Segregation (Cape Town)

Apartheid planning consisted of deliberately developing the city based on ethnic segregation. Planning was completely top-down and racially motivated and permeated through political and administrative processes.


These tools tackle how to embed humans in the urban fabric and showcase different approaches to creating urbanity.

Post-olympic Urbanism (Athens)

Olympic Games are considered a potential catalyst of urban development. They can intervene in short- and long-term development activities and require urban functions that may previously have been lacking. Applied correctly, they can be used to address urban issues (such as inner-city decline or sprawl). However, Athens is a prime example of how not to do it, as today most of the Olympic facilities are decaying.

Development through Distribution (Cape Town)

To overcome inequalities, distribution policies can be enacted that equalise the urban realm. In Cape Town, social housing and Mandela’s promise of “one house per family” drove the creation of new houses. However, the development happened within the geographical constraints set out by the previous apartheid regime and consequently reinforced social segregation. Additionally, because of the space each house requires, the city began to sprawl, diffusing the urban core.

Community Projects

This tool focusses on urban functionalities on the community level.

(Infra)Cultural Design (São Paulo)

To strengthen the urban fabric, Unified Educational Centres (CEU, Centro Educational Unificado) were placed strategically in diffuse urban locations. They offer new socio-cultural opportunities and enable communities to express themselves. Identities can form around these hubs, making the city more coherent.


This tool focusses on the interaction between suburban and urban.

Generating Suburbia (Detroit)

To accommodate single-family houses, suburban developments were created that rely on roads, cars and telephones to cover distance. The space requirement increases the surface area of a city disproportionately and requires large infrastructure spending to maintain roads. In the case of Detroit this had aggravating side-effects: people moved out of the administrative boundaries of the city into suburbia, reducing the city’s tax base and accelerating its decline. The overstretching thins the urban fabric, distributing and diffusing it.