Top 5 by Venus in Arms – week 46. “The best of Venus (2014-2015)”

Quite a different Top 5 this week. One year ago our “adventure” in the blogosphere started. Over the last 12 months Venus in Arms has tried to contribute to the current debate on defense and security from an Italian and European perspective. It has been hard work, but we are really satisfied with the results, and we are eager to improve our work every day. So, this week we present the best posts (in our opinion) published by Venus in Arms over the last year (March 2014 – March 2015). We deeply thank ALL the people who supported us with brilliant guest posts.

2015: a look ahead into development cooperation and transparency

Venus in Arms has already focused on the complex relationship between security and development (especially from a European perspective). Below is a guest post by Francesca Fondi* on development cooperation and transparency. 2015 will be a crucial year.

With 2014 officially over, 2015 is already shaping up as a key year for development cooperation. In particular, challenges lie ahead for global, European and Italian institutions in fulfilling international commitments and granting appropriate space and effort to the transparent delivery of aid.

Firstly, 2015 is the year in which the so-called post-2015 agenda will be finalised, setting the new strategic goals to fight poverty and ensure sustainable development on a global scale. The final deadline for the accomplishment of the Millennium Development Goals (MDGs – eight time-bound targets set in 2000 during the Millennium Summit) is in fact approaching: while the MDGs have been effective in mobilising substantial resources, some of the eight objectives are still far from being reached. The period following December 2015 will therefore require the finalisation of further priorities and targets for the world's long-term development agenda (the Sustainable Development Goals, SDGs). The UNDP and the OECD are among the organisations leading the process.

Moreover, in a period when a number of OECD DAC members are decreasing their funds for development assistance, some of the basic concepts at stake are also being redefined, such as that of ODA (Official Development Assistance). Discussions are ongoing among international development stakeholders to broaden the definition of development assistance towards more global development efforts and to include spending on security and stabilisation. The ‘new’ ODA would therefore encompass a wider range of activities, aimed at highlighting the link between development and security, particularly at a time when instability and humanitarian crises are growing on a global scale.

2015 is also the final year for the donor community to align with the commitments made in Busan, Korea, in December 2011 during the 4th High Level Forum on Aid Effectiveness. As an outcome of the meeting, all major bilateral and multilateral donor organisations signed up to the Busan declaration, committing, among other things, to “implement a common, open standard for electronic publication of timely, comprehensive and forward-looking information on resources provided through development cooperation”.

This statement entails alignment with the common standard for aid transparency, made up of the ‘traditional’ OECD DAC reporting modules (the Creditor Reporting System – CRS++ – and the Forward Spending Survey – FSS), as well as the IATI (International Aid Transparency Initiative) standard, already set up and agreed upon during the 3rd HLF in Accra in 2008.

Since 2012, multilateral and bilateral development organisations have therefore been working quite hard to progressively implement this commitment, which is, above all, a political one. However, the results achieved so far are not always satisfactory.

On one side, implementation of the common standard is currently monitored by the Global Partnership for Effective Development Cooperation report, in particular through the Transparency Indicator, a composite indicator that assesses the three core features of the common standard: timeliness, coverage and forward-looking information. On the other side, alignment with the IATI standard is closely followed by the watchdog NGO Publish What You Fund, which, every year since 2011, has assessed the performance of 68 donors through its Aid Transparency Index (ATI).

Not surprisingly, according to the 2014 ATI, the best pupils are the northern European donors (for instance SIDA, DfID and the Netherlands), as well as some of the major multilateral donors and development banks, such as UNDP and the World Bank. The European Commission, the biggest ODA provider worldwide, has been one of the organisations that most significantly improved its performance in the publication of data on foreign aid over the last couple of years. With all its services ranking among the top 15 in the 2014 ATI, DG Development and Cooperation (DEVCO) has been the frontrunner in the implementation of the IATI standard, closely followed by DG Enlargement, the Foreign Policy Instruments service (FPI) and ECHO.

Within the framework highlighted above, progress towards increased transparency of financial assistance to third countries appears even more relevant and appropriate.

However, the significant effort needed to adapt to a standard that inevitably cannot reflect the working processes and structures of such different organisations needs to be compensated not only by increased transparency of external assistance towards recipient countries and individual taxpayers, but also by the internal benefits of using the published information, notably for policy-making and the evaluation of assistance programmes.

Finally, 2015 has also been designated the European Year for Development: according to the European Commission's communication, it will represent a key opportunity to increase knowledge of, and discussion on, development across the continent. In times not only of economic crisis but also of a crisis of global solidarity values, Europe indeed seems in need of such messages and of a renewed awareness in this respect. The European campaign therefore intends to raise awareness among the wider public of the importance of the development agenda in Europe as well as worldwide, targeting in particular the ‘new’ Member States.

Within this development framework, the Italian Development Cooperation has started working towards a more transparent provision of information on its aid, more in line with the international standards mentioned above. A recent observer at IATI Steering Committee meetings, it has created an open portal that provides data on financial assistance to third countries. Based on the OECD CRS reporting system, the portal collects information mainly from the Italian Ministry of Foreign Affairs, local administrations and the Ministry of Finance; the plan is to further extend data coverage to private as well as non-governmental funding sources, in order to provide a complete picture of Italy's overall contribution to development assistance.

However, further efforts are still needed for full or even partial alignment with the Busan transparency commitments (Italy does not yet publish aid data on the IATI Registry and ranks among the poor performers in the 2014 ATI).

In any case, initiatives like the above deserve praise for their efforts to align with global standards and to make aid spending more transparent to citizens and development organisations.

Finally, endeavours to create a global standardised information system on development assistance, going well beyond the definition of ODA and the list of OECD DAC members, are in line with the current discourse on the so-called data revolution, a UN-led initiative aimed at enhancing the collection and use of data in the development context. Although the initiative has at times been criticised for intellectual sloppiness, the importance of being able to access and retrieve reliable data on which policy analyses and decisions can be based is not debatable.

Indeed, this is going to be one of the major challenges for 2015.

* Francesca Fondi has been dealing with the Balkans and development cooperation since 2006, working first in the field and currently at headquarters level in Brussels. She holds a PhD from IMT Institute for Advanced Studies Lucca, with a dissertation on higher education in Albania.

Intelligence in the Knowledge Society – Report from the International Conference

Guest Post by Davide Barbieri*

On Friday 17 and Saturday 18 October, the annual International Conference on Intelligence in the Knowledge Society took place in Bucharest. A large number of participants and auditors, coming from many European and neighboring countries as well as from Canada and the US, met at the premises of the National Intelligence Academy. Italy was represented, besides myself as a researcher, by some members of our national intelligence community and of the European External Action Service.

The plenary session was opened by the deputy director of the Romanian Intelligence Service (SRI), who highlighted the strategic importance of the Wider Black Sea Area for European security. The current international situation, in Eastern Europe and the Middle East, was at the heart of the conference. In particular, a convincing parallel between Crimea and Transnistria was drawn by a Romanian researcher and endorsed by the Ukrainian and Moldovan representatives.

Elaine Pressman, from Canada, expanded on her VERA (Violent Extremist Risk Assessment) system in order to apply it to the insider threat. VERA has already attracted the interest of several agencies around the world. Its Bayesian-like framework, stemming from the behavioral sciences, hints at the possibility of a more mathematical approach, which is currently lacking (partly because of intelligence analysts' well-known preference for qualitative analysis).

Cosmin Dugan, a medical doctor from Bucharest, gave an extremely interesting presentation on the possibilities and strategies for neuro-cognitive enhancement of intelligence practitioners. His multidisciplinary perspective spanned from philosophy and technology (neuro-feedback, neuro-gaming) to medicine. In his opinion, trans-cranial stimulation, complex reasoning training, dietary supplements and drugs can be effectively adopted for neuro-cognitive enhancement, taking into account the possibility of side effects.

I opened the workshop session on Saturday, proposing a data-driven approach to identifying terrorists, based on the assumption that intelligence agencies have collected large amounts of data on persons of interest. The suggested method bears a resemblance to medical diagnostics; the similarity between that discipline and intelligence analysis has also been suggested by Prof. Sebe, from the University of Bucharest. Spotting a terrorist may in fact be akin to spotting a rare disease within a large sample of individuals, as the sketch below illustrates.
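
The analogy can be made concrete with a quick Bayesian base-rate calculation. The sketch below is purely illustrative and does not reproduce the method presented at the conference: the prevalence, sensitivity and specificity figures are hypothetical, chosen only to show that when the condition sought – a rare disease or a terrorist profile – is extremely uncommon, even a very accurate screening procedure produces mostly false alarms.

```python
# Illustrative Bayes computation: why rare-event detection is hard.
# All numbers below are hypothetical, chosen only to show the base-rate effect.

def positive_predictive_value(prevalence, sensitivity, specificity):
    """P(truly positive | flagged positive), via Bayes' theorem."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# A 'rare disease': 1 case in 100,000 people, screened with a 99%-accurate test.
ppv = positive_predictive_value(prevalence=1e-5, sensitivity=0.99, specificity=0.99)
print(f"Probability that a flagged individual is a true positive: {ppv:.4%}")
# Roughly 0.1%: the overwhelming majority of alerts are false positives.
```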

The other workshop focused on the opportunity to bridge government, competitive and business intelligence, especially in public administration, in order to increase efficiency, assess performance and gain competitiveness.

Ideas drawn from cybernetics, complex adaptive systems, big data analysis and machine learning were presented by several researchers of the Romanian Intelligence Academy, also in the poster session. In particular, text mining was applied to OSINT and web intelligence. Much attention was given to radicalization, particularly in the Islamic world. A fascinating study focused on Dabiq, the Islamic State magazine.

The excellent organization and the participation of many internationally recognized lecturers were well worth the journey and the time. As a side note – notwithstanding that Italy contributed just one paper, and despite the scant consideration we tend to give our own scientific production – Italian authors were cited extensively throughout the conference (Sartori, Negri and Calvino, to name a few), alongside Italian technology (in particular, the widely successful Cogito web engine for semantic intelligence by Expert Systems). This year's XX edition of the conference proved to be a successful anniversary.

* Davide Barbieri, PhD, is a Research Fellow  in the Department of Biomedical Sciences and Surgical Specialties at the University of Ferrara, davide.barbieri@unife.it

Top 5 by Venus in Arms – week 23

As news of the first case of Ebola on American soil (in Dallas) spread, some experts are evaluating whether Ebola can be weaponized and used as a WMD. The recent discovery of files related to biological weapons development on a recovered Islamic State laptop seems to have brought attention back to WMD as a tool that can fall into the hands of “terror”.

Re-emerging threats might lead to a further expansion of surveillance by security agencies. Activist and software geek Brad Templeton talks in a BigThink video interview about the NSA's attempt to access so-called quantum computing technology, which might expand the agency's ability to break cryptography and manage big data.

Focus on the Middle East sometimes distracts from events happening elsewhere. “Close encounters” between Chinese and US aircraft in the South China Sea are small but important hints that international politics is on the move in the Pacific.

The defense industry is always on the move, even with budget cuts affecting several armed forces and leading to the downsizing of programs (Italy's to begin with). Still, the US military at least is always tuned to exploit new technologies, such as 3D printing, to improve the effectiveness and/or efficiency of some of its processes and programs.

Western prisoners in the hands of IS – not to forget other cases where kidnapping is a consolidated strategy for armed groups – are everyday news. Jumping back in history, it is insightful to read about the experience of the longest-held American prisoner of war, John Downey, confined in a Chinese prison for two decades after the Korean War.

Big data: An epistemological revolution?

Guest Post by Davide Barbieri*

During the last decades, large organizations – multinational companies, governments, hospitals, public administrations, law enforcement agencies and the like – have accumulated huge quantities of data, often in the form of unstructured spreadsheets, emails and text documents. SCADA (Supervisory Control and Data Acquisition) systems have automatically collected production data from sensors and machines, while cheap storage devices, like terabyte-sized hard disks, have reduced the need to filter the acquired data in advance according to statistical criteria. The Internet has increased the order of magnitude of the phenomenon. The mesh-like topology of the network allows data to be communicated and spread quickly and efficiently, or stored in cloud-computing facilities. Opinions and comments can be collected from users on blogs and news sites. We may assume that structured data, stored in corporate databases, account for the smaller part of the big data explosion. As a consequence, it is far from straightforward to extract the meaningful bits from those huge data repositories, in order to distinguish reliable information (or intelligence) from noise.

Information technology allows users to select and aggregate information from databases by means of query languages (such as SQL, the Structured Query Language). Such languages implement the most common statistical functions, like mean, range and standard deviation. Where queries are not enough, scientific software packages allow skilled users to perform more advanced statistical analyses. Still, since the end of the 1990s a new set of technologies has emerged, collectively known as data mining, which allows meaningful – but often unpredicted or counterintuitive – knowledge to be extracted from large datasets. As technology progresses, engineers, statisticians and mathematicians must face a new epistemological challenge: is the information processed by means of data mining reliable? Can it be considered scientific knowledge? Leaving aside the fact that even the possibility of answering such questions is arguable, since the definition of science is not obvious, we can try to shed some light on them by sifting through the history of scientific thought.
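
As a concrete illustration of the kind of selection and aggregation described above, here is a minimal sketch using Python's built-in sqlite3 module; the table, its columns and the sensor readings are invented for the example.

```python
import sqlite3
import statistics

# Hypothetical SCADA-style readings, aggregated with plain SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor TEXT, value REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?)",
    [("s1", 10.2), ("s1", 9.8), ("s1", 10.5), ("s2", 20.1), ("s2", 19.7)],
)

# The most common aggregates are built into the query language itself.
for sensor, mean, lo, hi in conn.execute(
    "SELECT sensor, AVG(value), MIN(value), MAX(value) "
    "FROM readings GROUP BY sensor"
):
    values = [v for (v,) in conn.execute(
        "SELECT value FROM readings WHERE sensor = ?", (sensor,))]
    print(sensor, "mean:", round(mean, 2), "range:", round(hi - lo, 2),
          "std dev:", round(statistics.pstdev(values), 2))
```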

Starting from the XVI-XVII century, the progress of modern science (during the Scientific Revolution) has been supported by the collection of empirical data, that is, observable and measurable facts. This epistemic premise does not deny the existence of a non-observable reality, that is, metaphysics, but it does deny that such a reality can be investigated scientifically. Even if it can be questioned whether empirical science appeared first and was then followed by epistemology, or vice versa, the philosophers who first tried to elaborate a scientific method in a formal (i.e. logically consistent) way were empiricists, like Francis Bacon and David Hume, who affirmed the central role of the inductive method in the production of scientific knowledge and denied the validity of a priori knowledge (Descartes, more on the rationalist side, can be considered one notable exception).

Even if it has become a broader concept in contemporary times, classic induction – as defined by the Greek philosophers using the term epagoghè – essentially consists in the mental process of inferring a general conclusion from (possibly many) particular observations. Still, however large the amount of evidence, the conclusion can never be considered certain: a single observation is sufficient to reject it. For example, no matter how many black crows we observe, the conclusion that all crows are black can be proved wrong by a single non-black crow. Conclusions obtained by means of induction are therefore only probable. They cannot be considered universal (always true), but only contingent, leaving some room for inaccuracy and uncertainty and, as such, somewhat generic.

For this and other reasons, during antiquity and the Middle Ages deduction – as formalized by Aristotle – was preferred. The Aristotelian method infers necessary conclusions from general premises, as in the famous syllogism: all men are mortal and Socrates is a man, therefore Socrates is mortal. Still, the fact that a deduction is formally or logically correct does not guarantee that the conclusion is true: all animals fly, donkeys are animals, therefore donkeys fly. The syllogism is correct, but the conclusion is false, since the premises are false. A single non-flying animal (a donkey or a dog, for instance) can prove the main premise (all animals fly) to be wrong. General premises should therefore be considered hypotheses, so that a syllogism assumes the form: if A then B, as in computer logic.

Therefore, neither inference method can lead us to certain knowledge. Historically, it was induction that paved the way to modern scientific thought. However, the inductive method was countered by Karl Popper during the XX century. In his opinion, the whole scientific method consists in stating hypotheses in order to solve problems. These hypotheses must then face the challenge of evidence, which can corroborate them or prove them wrong (“falsify them”, in Popper's words), but never verify them (prove them true), as in the following purely deductive schema: Problem 1 → Hypothesis (tentative solution) → Error elimination (confutation) → Problem 2. Popper's hypotheses are similar to Plato's ideas, in the sense that they are in the scientist's mind a priori, even if they are often not universal (some are, like the hypotheses of mathematics, which can be demonstrated to be necessarily true or false in a deductive way). Most of these ideas are in fact assumptions, conjectures, like medical or biological theories. Any such hypothesis can therefore be falsified, sooner or later – and partially or totally rejected – raising new, deeper problems, which will need new, possibly more inventive and courageous attempts at a solution (new hypotheses, to be tested against new evidence). According to Popper, the idea that scientific progress can be supported by means of induction is just an illusion, being effectively challenged by different shades of grey crows, white flies, Australian black swans, “inductivist” turkeys (I have to thank Bertrand Russell for this one) and other statistical outliers. Hypotheses are simply triggered by the unexpected, when ideas do not adhere to reality (the observed facts). Apparently, classic Aristotelian logic has had the upper hand over Bacon's induction.

Still, the birth of information technology raised the following challenge: can machines think? This question was put forward by the father of artificial intelligence, Alan Turing, in a paper published in Mind in 1950: “Computing Machinery and Intelligence”. The question did not concern the idea that machines could infer necessary conclusions in a deductive way, automatically – that is, “mechanically” – since that was taken for granted. This is the case, for example, of computation, where a computer obtains a result by applying mathematical rules, given a priori, to input numbers. Rather, it was the idea that computers could adopt inductive reasoning – that they could learn from experience, empirically – which was being questioned. We can therefore rephrase the question as follows: can machines have the capacity for abstraction? This capacity is interestingly similar to the human faculty of imagination, the ability to “see” something which is not immediately perceived by the senses. It is needed, for example, to solve the strictest CAPTCHAs (Completely Automated Public Turing tests to tell Computers and Humans Apart). The question reminds us of the Scholastic problem of universals: what do similar objects have in common? What is the essence that they share?

Statistical inference – the methodology by means of which scientific hypotheses are either rejected or accepted – is in fact mainly inductive in nature. Samples, from which data are collected, must be as large as possible for the conclusions to be statistically significant, in which case they have the strength to generalize to the entire population from which the sample is drawn. Like any theory, Popper's method is now facing a new challenge. In an interesting article published by Wired in 2008, purposefully and prophetically titled “The End of Theory: The Data Deluge Makes the Scientific Method Obsolete”, Chris Anderson challenges the assumption of the deductive nature of the scientific method. He foresees the end of theory, that is, of Popper's hypotheses formulated before searching for the empirical evidence that can only falsify or corroborate them. His opinion is that data can speak for themselves, without any of the pre-assumptions needed in classic statistical inference. The idea that all assumptions can be eliminated has been effectively rejected by Massimo Pigliucci in “The end of theory in science?”, published in EMBO Reports in June 2009. He is right on point when he states that collected data are always somehow selected, that observations are made according to pre-assumptions.

A frequentist approach has nonetheless led to many advances in science, for example in cryptanalysis. In this trade, analysts do not know the rules by means of which the ciphertext has been encoded, but they may know the frequency distribution of letters or of their combinations. Since the distribution of the symbols in the ciphertext must resemble that of letters in natural language, they can break the code provided they are given a large enough portion of ciphertext. In fact, the probability that a symbol corresponds to a given letter is very high if the two have the same relative frequency. For example, since e is the most frequent letter in English texts, the corresponding symbol should be the most frequent in the ciphertext. This holds by the law of large numbers, which states that relative frequency tends to theoretical probability as the amount of evidence increases. Once the cipher has been broken by inductive means, the unveiled rule can be applied deductively to break any other incoming ciphertext. Turing's contribution to this endeavor is well known: he was part of the British intelligence team – based at Bletchley Park – that deciphered the Nazi-German ENIGMA code during WWII.
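
To give a flavour of this inductive, frequency-based reasoning, here is a minimal sketch – not a reconstruction of any wartime technique – that breaks a simple substitution cipher (a Caesar shift) by scoring every candidate key against approximate English letter frequencies. The frequency table and the ciphertext are illustrative.

```python
from collections import Counter

# Approximate relative frequencies (%) of letters in English text.
ENGLISH_FREQ = {
    'e': 12.7, 't': 9.1, 'a': 8.2, 'o': 7.5, 'i': 7.0, 'n': 6.7, 's': 6.3,
    'h': 6.1, 'r': 6.0, 'd': 4.3, 'l': 4.0, 'c': 2.8, 'u': 2.8, 'm': 2.4,
    'w': 2.4, 'f': 2.2, 'g': 2.0, 'y': 2.0, 'p': 1.9, 'b': 1.5, 'v': 1.0,
    'k': 0.8, 'j': 0.15, 'x': 0.15, 'q': 0.1, 'z': 0.07,
}

def shift(text, k):
    """Shift every letter k positions back in the alphabet (candidate decryption)."""
    return "".join(
        chr((ord(c) - ord('a') - k) % 26 + ord('a')) if c.isalpha() else c
        for c in text.lower()
    )

def english_score(text):
    """Higher when the observed letter distribution resembles English."""
    counts = Counter(c for c in text if c.isalpha())
    total = sum(counts.values()) or 1
    return sum(ENGLISH_FREQ[c] * n / total for c, n in counts.items())

ciphertext = "wkh glvwulexwlrq ri vbperov ehwudbv wkh nhb"  # plaintext shifted by 3
best_key = max(range(26), key=lambda k: english_score(shift(ciphertext, k)))
print(best_key, "->", shift(ciphertext, best_key))
# 3 -> the distribution of symbols betrays the key
```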

A similar approach is effectively used by Google Translate, allowing users to translate exotic languages without the search engine needing to know their rules. It also underlies other challenging data mining tasks. In classification, for example, algorithms look for rules of the form if A then B, and can effectively unveil unpredicted patterns that may support marketing decisions or even medical diagnoses, provided that the relative frequency – or support – of the rule is high enough. Trivial rules, like if high temperature then flu, may have strong support and few exceptions, but they are of little use (regardless of the fact that exceptions do not break the rule). Still, other rules may be found which would remain unspotted if investigations were made exclusively on the basis of pre-assumptions, as the sketch below suggests.
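
The notion of support (and its companion measure, confidence) can be illustrated in a few lines. The records and the rule in the sketch below are invented for the example; real rule-mining algorithms such as Apriori do essentially this counting at scale.

```python
# Hypothetical patient records: how strong is the rule "if fever then flu"?
records = [
    {"fever", "flu"}, {"fever", "flu"}, {"fever", "cold"},
    {"flu"}, {"cough", "cold"}, {"fever", "flu", "cough"},
]

def support(records, itemset):
    """Fraction of records that contain every item in the itemset."""
    return sum(itemset <= record for record in records) / len(records)

def confidence(records, antecedent, consequent):
    """Among records matching the antecedent, the share also matching the consequent."""
    return support(records, antecedent | consequent) / support(records, antecedent)

rule_support = support(records, {"fever", "flu"})
rule_confidence = confidence(records, {"fever"}, {"flu"})
print(f"support = {rule_support:.2f}, confidence = {rule_confidence:.2f}")
# support = 0.50, confidence = 0.75: frequent, with a few exceptions.
```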

The big data phenomenon presents both risks and opportunities, including that of an epistemological upgrade of the way we do science today. As the match between induction and deduction goes on, I shall refrain from attempting to settle it in this article, leaving the conclusion to Juvenal: Rara avis in terris, nigroque simillima cygno. Corvo quoque rarior albo (“A rare bird on earth, most like a black swan; rarer even than a white crow”). A perfect epitome of the scientific method.

* Davide Barbieri, Dep. of Biomedical Sciences and Surgical Specialties, University of Ferrara (Italy), davide.barbieri@unife.it

Top 5 by Venus in Arms – week 1

The wars in Iraq and Afghanistan produced, at the very least, several lessons for students of war. One of their protagonists, General Stanley McChrystal, offers lessons on leadership in which he strongly argues that the human factor is still what makes good armies. You can find a summary in this interesting piece.

From land to air, with Dan Lamothe's piece on the difficult role of the US in helping Central American countries such as Honduras fight drug trafficking. Should it assist “shady” militaries for a greater good?

This week's focus is on data and access to data, a hot theme in cyber-related matters. Big Data is the term of the year, perhaps. Here you can find the “undercover economist” Tim Harford taking a critical look at the issue.

We stay tuned to technology, which affects war-making but also, possibly, war's consequences. At MIT, recent research blending math, 3D printing, and concern for post-conflict health might help people with amputated limbs recover.

A random suggestion: for those passionate about geography and maps, check out the New York Public Library's latest open-access project, NYPL Map Warper. Thousands of original maps, which you can also work on and download.
