Friday 29 June 2018
A constitution for a confederate Europe - beyond the nation state
The traditional nation state is not enough to ensure prosperity, security and freedom in a world of multi-national interests. The solution is not to grow larger nation states but to move beyond them to associations or confederations where there is a consensus on core values.
Getting this right poses considerable constitutional challenges. The European Union has developed into an unintentional experiment in this process. There is, however, a lack of clarity about the principles and values guiding the process. The Prometheus Institut has proposed a Manifest for a confederate Europe, in which it argues for a confederation of constitutionally liberal states.
The core principles of a future EU constitution should be the common market for goods and services and free trade with the rest of the world. A confederate Europe will require a liberal monetary policy that does not divide Europe into different classes within a rigid framework of a common currency. Additionally, states that will not join the EMU in the foreseeable future must still have a place within a flexible European community.
The Manifest for a confederate Europe contains points to debate: it is strongly focused on EMU concerns, and not all of its items will find a consensus. But it frames the debate on the future of the EU as it should be framed: how to achieve the extension of constitutional liberalism beyond the nation state.
Privacy: your data
At the Liberal Democrats Autumn conference I helped organise an ALDES (Association of Liberal Democrat Engineers and Scientists) discussion session on privacy and security titled "Your data, your choice". The motivation for the session was that:
Government, business and our personal lives are increasingly driven by our personal data. Credit card transactions, location data, and health records have the potential to improve products, provide insight for policy making, and detect security threats. But they also challenge our notions of privacy, intimacy and autonomy. How can results from privacy research be translated into policy?
The session was chaired by Richard Gomer, a researcher in Meaningful Consent. The panellists brought expertise from the areas of publishing, computer science, artificial intelligence and security:
- Yves-Alexandre de Montjoye (Data Scientist and privacy researcher at Imperial College London)
- Luc Moreau (Professor of Computer Science at King’s College London)
- Yogesh Patel (Chief Scientist at Callsign)
- Leonie Mueck (Division Editor at PLOS ONE)
There are steps that individuals, as well as local and national governments, could take to protect privacy: regulation and law on the use of personal data, education, and digital rights. The intent is to pursue this topic with Liberal Reform and the Liberal Democrat Lawyers Association.
An important aspect of the discussion was the impact of algorithms with access to very large data sets of personal information. This was the topic addressed at the two-day Royal Society discussion (30-31 October): "The growing ubiquity of algorithms in society: implications, impacts and innovations".
This session was largely dominated by legal considerations, but it served to underline the challenges to regulation and the rule of law that arise from technical advances in the availability of personal data, computation power and algorithmic innovation. Technology may also provide a defence, especially for those aspects of privacy that are close to data security. The formalisation of secrecy, following the seminal work of Claude Shannon during WWII, allows an aspect of privacy to be defined in a mathematically precise sense. This can be implemented to give some guarantee that the use of your personal data does not impact (or bounds the impact on) your privacy.
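One concrete example of such a mathematically precise guarantee is differential privacy (an illustration chosen here, not a technique discussed at the session), which bounds how much any single person's record can change the output of a query on a data set. A minimal sketch of the Laplace mechanism for a counting query, with all names invented for the illustration:

```python
import math
import random

def laplace_noise(scale):
    """Draw one sample from a Laplace(0, scale) distribution
    by inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1, u) * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon):
    """Release a count with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one
    record changes the count by at most 1), so Laplace noise with
    scale 1/epsilon bounds each individual's influence on the
    released value.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Example: how many people in the data set are over 60? The released
# value is close to the truth, but the presence or absence of any
# single record is masked by the noise.
ages = [34, 71, 65, 22, 58, 80, 45]
noisy = private_count(ages, lambda a: a > 60, epsilon=1.0)
```

Smaller values of `epsilon` give a stronger privacy guarantee at the cost of a noisier answer; this trade-off is exactly the "bounds the impact" idea mentioned above.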
Friday 16 September 2016
Growth of knowledge and critical rationalism
An evolutionary model of knowledge growth is proposed that overcomes the weaknesses of models based on justified true belief.
A threefold classification of useful knowledge is provided as:
- Capability
- Insight
- Objective knowledge.
It will be taken as self-evident that a body of knowledge cannot be established with absolute certainty, but arguments can be presented to support it and it can be criticised. However, it is extremely important not to embark on a futile search for ultimate justification and certainty, which would lead to a frozen body of eternal truths. Knowledge growth is an evolutionary process. Knowledge is not something that can be stock-piled like gold bars, or matured in a barrel like wine: it is speculative, fallible and volatile. The evolutionary explanation of knowledge growth has been concisely described by Popper (Conjectures and Refutations, Routledge and Kegan Paul, London, 5th edition, 1974):
The way in which knowledge progresses, and especially our scientific knowledge, is by un-justified (and unjustifiable) anticipations, by guesses, by tentative solutions to our problems, by conjectures. These conjectures are controlled by criticism; that is, by attempted refutations that include severely critical tests. The conjectures may survive these tests; but they can never be positively justified: they can neither be established as certainly true nor as ’probable’ (in the sense of probability calculus). Criticism of our conjectures is of decisive importance: by bringing out our mistakes it makes us understand the difficulties of the problem which we are trying to solve. This is how we become better acquainted with our problems, and able to propose more mature solutions: the very refutation of a theory–that is, of any serious tentative solution to our problem–is always a step forward that takes us nearer to the truth. And this is how we can learn from our mistakes.
The process of testing and trial and error gives the knowledge obtained in this way an objective status and a practical reliability that cannot be obtained by firmly held belief. It does not, however, confer infallibility, so in practical application alertness to possible failure, or to the need for correction, must be maintained. This gives rise to the knowledge growth life-cycle model shown in the figure above. In this model the cycle starts with the recognition of a problem or challenge that needs a solution; this can come about, for example, through contradictions or incompatibilities in the pool of knowledge. A clear current example is the incompatibility of General Relativity and Quantum Mechanics.
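The life-cycle can be rendered as a loop: a tentative solution survives only until some critical test refutes it, and each refutation sharpens the problem. The following Python sketch is purely illustrative (all names are invented, and the mechanical "guessing" stands in for genuinely creative conjecture, which has no mechanistic recipe):

```python
def grow_knowledge(problem, conjecture, tests, max_rounds=10):
    """Sketch of the conjecture-and-refutation cycle: propose a
    tentative solution, subject it to critical tests, and treat
    each refutation as a refinement of the problem."""
    for _ in range(max_rounds):
        solution = conjecture(problem)              # unjustified guess
        failures = [t for t in tests if not t(solution)]
        if not failures:
            return solution                         # survives, but stays fallible
        problem = (problem, failures)               # refutation refines the problem
    return None

# Toy example: conjecture candidate laws y = a * x and refute them
# against observations until one survives every test.
observations = [(1, 3), (2, 6), (4, 12)]
tests = [lambda law, x=x, y=y: law(x) == y for x, y in observations]

def conjecture(problem):
    # Try slopes 1, 2, 3, ... in turn; a stand-in for creativity.
    conjecture.a = getattr(conjecture, "a", 0) + 1
    return lambda x, a=conjecture.a: a * x

law = grow_knowledge(observations, conjecture, tests)
```

A surviving conjecture is only tentatively accepted: adding a new observation (a new critical test) can refute it and restart the cycle, which is the point of the model.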
There is no mechanistic process for creativity, but it is mandated that creative outputs should be testable in principle. Testability is a logical requirement on the formulation of a theory or an explanation: if a solution is not testable then it is independent of what is the case in the world, and therefore irrelevant to explaining what is the case. The reason why only testability in principle should be mandated is that a new explanation or solution can itself suggest new measurement or test methods, and therefore poses a new problem of how to realise the test. This is possible because testability is part of the logical structure of an explanation and does not require specific tests to be put forward initially.