
Description

Write up your understanding of the myths, demonstrating a solid grasp of the reading and elaborating on your reflections. Your work should read clearly and be free of grammatical errors, typos, and repetition. Mention the seven myths and give your opinion.

Technometrics
Journal homepage: https://www.tandfonline.com/loi/utch20
Debunking Seven Terrorism Myths Using Statistics
Stan Lipovetsky
To cite this article: Stan Lipovetsky (2021) Debunking Seven Terrorism Myths Using Statistics,
Technometrics, 63:1, 137-140, DOI: 10.1080/00401706.2020.1864998
To link to this article: https://doi.org/10.1080/00401706.2020.1864998
Published online: 26 Jan 2021.
BOOK REVIEWS
is knowing how to merge datasets and how to appropriately tackle the problems that come with merging. A very practical and helpful chapter.
In a nutshell, I can safely conclude that all the chapters are nicely structured. To get the maximum benefit from this book, readers should have taken an introductory course in statistics and statistical computing. In summary, this is a good contribution, providing up-to-date coverage of selected topics in data management in a logical and systematic manner.
Feryaal Ahmed
London, Canada
Debunking Seven Terrorism Myths Using Statistics, by
Andre Python. Boca Raton, FL: Chapman and Hall/CRC
Press, Taylor & Francis Group, 2020, xvii+132 pp., $100.00
(hardback), $35.96 (eBook), ISBN: 978-0-367-47228-3
(hardback).
The monograph belongs to the ASA-CRC Series "Statistical Reasoning in Science and Society," and it is devoted to statistical studies of terrorism, which generally can be defined as the deliberate creation and exploitation of fear through violence or the threat of violence in the pursuit of political change. The book consists of the following chapters.
The Introduction is titled The Role of Statistics in Debunking Terrorism Myths. It states the aim of the book: communicating knowledge on terrorism obtained from the analysis of terrorism data. Since the deadliest coordinated attacks of September 2001, perpetrated by nonstate agents, in which almost 3000 people were killed at the NYC World Trade Center, the Pentagon, and in Pennsylvania, several large databases have been created to gain insight into terrorism. The key statistical characteristics and main findings on terrorist events are described in this book, organized in nine chapters which, besides the introduction and conclusion, deal with the so-called seven myths about terrorism.
Myth 1: We Know Terrorism When We See It. The chapter highlights how different views on terrorism change the perception of the observed patterns in actual data. To date, no consensus on the definition of terrorism has been reached in academia or among governments, and even the definition suggested by the UN Security Council in 2004 remains nonbinding on Member States. The most frequently cited concepts of terrorism include violence-force, political, fear-terror, and threat. Four main characteristics can be identified: (1) a political aim behind the acts; (2) fear generated; (3) publicity needed; (4) civilian targets deliberately chosen. Terrorism and mass violence can be conducted by totalitarian state regimes, which can also use non-state actors, providing them support in money, weapons, military advice, and training. That is state-sponsored terrorism, but this book considers mostly nonstate terrorism.
Investigation of terrorism uses several sources with geolocalized data: the Global Terrorism Database (GTD), the Rand Database of Worldwide Terrorism Incidents (RDWTI), and the Global Database of Events, Language, and Tone (GDELT). There are other databases, for example, International Terrorism: Attributes of Terrorist Events (ITERATE), but they do not include the geographical localization of the target city or other coordinates. There are multiple discrepancies across the databases because of different sources of information and the diverse definitions used for classifying the events. The first three databases are compared by mapping the terror events on the globe in 2002-2009, and the results do not even superimpose but differ in the numbers of deaths and wounded, type of weapon, names of perpetrators, etc. Several other maps are presented for the events in seven countries of North Africa in 2002-2017, and the assaults on civilian and non-civilian targets diverge noticeably in the numbers of attacks and deaths, measured in many hundreds. State repression against nongovernment targets leading to lethal events is studied by the Uppsala Conflict Data Program (UCDP), which collected the data available via the Georeferenced Event Dataset (GED). For example, the Democratic Republic of Congo (DRC) encountered 526 lethal state attacks and 210 lethal non-state terrorist attacks, but the data from GED and GTD differ significantly, which makes comparisons challenging. Another aspect, political versus nonpolitical terrorist events, was studied with the GTD data on Pakistan by mapping the numbers of attacks across the country. Pakistan hosts notorious home-based and foreign-based terrorist groups and organizations, including several branches of Al-Qaeda. In 2002-2017, there were 9790 politically motivated terrorist attacks, while 146 attacks had other goals. Terrorism can also be distinguished by other classification criteria, for example, by the size of the groups, the country of the perpetrators' origin, etc. Recent approaches to improving empirical measurement of conflict events were implemented with the help of the R statistical language in the package meltt (Matching Event Data by Location, Time, and Type).
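The matching idea behind meltt can be sketched outside R as well. The fragment below is an illustration only: the coordinates, dates, and tolerance values are invented, and the real package is far more sophisticated. It pairs events from two hypothetical databases when they fall within a spatial and temporal tolerance:

```python
from datetime import date

# Toy records in the style of event-database entries: (latitude, longitude, date).
# All values here are invented for illustration only.
db_one = [(33.34, 44.40, date(2017, 5, 1)), (36.34, 43.13, date(2017, 6, 10))]
db_two = [(33.35, 44.41, date(2017, 5, 2)), (30.00, 47.80, date(2017, 8, 3))]

def match_events(db_a, db_b, space_tol_deg=0.1, time_tol_days=2):
    """Pair events (by index) that fall within spatial and temporal tolerances."""
    pairs = []
    for i, (lat_a, lon_a, d_a) in enumerate(db_a):
        for j, (lat_b, lon_b, d_b) in enumerate(db_b):
            close_in_space = (abs(lat_a - lat_b) <= space_tol_deg
                              and abs(lon_a - lon_b) <= space_tol_deg)
            close_in_time = abs((d_a - d_b).days) <= time_tol_days
            if close_in_space and close_in_time:
                pairs.append((i, j))
    return pairs

print(match_events(db_one, db_two))  # only the first events of each database match
```

Discrepancies across databases then show up as events in one source with no counterpart within tolerance in the other.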
Myth 2: Terrorism Only Aims at Killing Civilians. According to GTD, the world's most extensive unclassified database, in 2002-2017 there were 75,906 terrorist attacks; in about half of those, roughly 190,000 people were killed, while in the other half only injuries occurred. The median number of deaths is 1 and the mean is 2.5 per terrorist attack. The proportion of lethal versus nonlethal attacks varies by continent: Africa 62%, Asia 54%, South America 35%, North America 26%, and Europe 21%. There are different classifications of attacks by total casualties, for example, into two classes of fewer than 5 killed or wounded versus 5 or more. There are 2.5 times more low-casualty than high-casualty events (52,890 vs. 20,425, respectively). Mass-casualty attacks that kill a large number of people are rare. Another classification defines three categories: low-casualty attacks (22,906 events, no deaths or wounded), intermediate-casualty attacks (41,146 events, with up to 10 deaths or wounded), and high-casualty attacks (9263 events, counting more than 10 deaths or wounded). Asia and Africa tend to encounter intermediate-casualty attacks, while Europe rather exhibits the low-casualty class. Many questions remain unanswered: for instance, should other factors, such as economic loss, be added to estimate the magnitude of terrorist events, or should other classifications be used? While most terrorist groups do not seek to provoke a high number of casualties, so as not to jeopardize their own survival in the population, some of them may inflict large human losses. Mapping the numbers of casualties (measured in dozens
of thousands) reveals the five most dangerous organizations, in order. First place is taken by the Islamic State of Iraq and the Levant (ISIL, aka ISIS; it produced twice as many deaths as the Taliban; founded in 2004; its main goal is to create an Islamic caliphate across Iraq and Syria). Then come the Taliban (Afghanistan; founded in 1989; main goal: to remove the foreign military occupation from the country), Boko Haram (Nigeria; founded in 2002; main goal: to establish an Islamic government with strict Salafist sharia in the country), Al-Qaeda in Iraq (founded in 2004; later developed into ISIL), and Tehrik-i-Taliban Pakistan (TTP; founded in 2007 by a shura council of 40 Taliban leaders, though the movement has been known since 1998; main goals: to apply sharia and to combat the coalition forces in Afghanistan). According to GTD, most casualties are perpetrated by unknown terrorist groups, and uncertainty remains in the identification of the most lethal groups.
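The three-way casualty classification described above is easy to reproduce. A minimal Python sketch follows; the thresholds mirror the categories in the review, while the attack records are invented for illustration:

```python
def casualty_class(killed, wounded):
    """Three-way classification by total casualties, as described in the review:
    0 -> low, 1-10 -> intermediate, more than 10 -> high."""
    total = killed + wounded
    if total == 0:
        return "low"
    if total <= 10:
        return "intermediate"
    return "high"

# Hypothetical attack records as (killed, wounded) pairs:
attacks = [(0, 0), (2, 5), (30, 80), (1, 0)]
labels = [casualty_class(k, w) for k, w in attacks]
print(labels)  # ['low', 'intermediate', 'high', 'intermediate']
```

Applied to the GTD counts, such a rule yields the 22,906 / 41,146 / 9263 split reported in the chapter.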
Myth 3: The Vulnerability of the West to Terrorism. The strong influence of the global Western media across the world yields more coverage of events in the West, which produces the impression that more terrorism occurs there. Actually, in 2002-2017, 75% of terrorist attacks took place in Asia (including the Middle East), 15% in Africa, 7% in Europe, 2% in South America, and about 1% in North America and Oceania together. In the same 16-year period, the losses of life are distributed similarly: terrorists killed 140,000 people in Asia, more than 43,000 in Africa, about 5000 in Europe, and about 2000 individuals in total in South and North America plus Oceania. Moreover, the events are concentrated in a few locations in the Middle East and Africa, with the highest numbers of terrorist attacks in Iraq, Pakistan, India, Afghanistan, and the Philippines. By the numbers of deaths, most terrorism victims lived in Iraq, Afghanistan, Pakistan, Nigeria, Syria, and Somalia. Among the top 20 countries with the maximum number of lethal attacks, half are in Asia, a quarter in Africa, two in Europe (Russia and Ukraine), and one in South America (Colombia), but none in North America or Oceania. Among the 124 countries that encountered lethal attacks, the United States ranks 36th with a total of 72 such events; the United Kingdom and France take the 44th and 45th places with 33 and 31 lethal attacks, respectively. Within the considered 16 years, the United States encountered less than 0.5% of the lethal assaults that happened in Iraq, less than 3.7% of those in Somalia, and 6.2% of those in Yemen. The cities most targeted by terrorism are Baghdad, Mosul, and Baqubah in Iraq, Karachi in Pakistan, and Mogadishu in Somalia. A comparison of various kinds of attacks is also performed for Baghdad versus Paris: armed assault, assassination, bombing/explosion, hijacking, hostage taking, kidnapping, and other kinds of terrorist attacks. In general, of every seven lethal terrorist attacks in the world, one occurs in Baghdad, the capital of Iraq.
Myth 4: A Homogeneous Increase of Terrorism over Time. From the same data source for 2002-2017, the patterns of change in the number of events and the associated number of fatalities are studied on a monthly basis to identify possible trends. Cyclical regularities could also occur, because after a series of terrorist attacks public opinion can force governments to take counterterrorism actions, which reduce terrorist activity. During this time terrorists can rearrange their organization, recruit new members, and prepare new attacks, leading to cycles of violence. Finding such regularities can help authorities anticipate potential activity and elaborate defensive measures. Using the R package forecast, the monthly trends were identified by a two-sided moving average smoother (the procedure known as 2 x 12-MA) based on the values of the 6 previous and 6 next months. Multiple maps are presented demonstrating terrorist attacks profiled by year and month via actual data and averages, for various spatial scales of continents, countries, and cities. The curves illustrate that most attacks took place in Asia; a lower curve corresponds to Africa, an even lower one to Europe, etc. All the curves have a maximum at about 2009, and an even higher maximum is located around the year 2015. The worldwide curve of terrorist activity in 2014-2017 runs higher than in 2002-2014, with a similar pattern in Asia and Africa and a lower curve for Europe. Much smaller numbers of monthly terror attacks are observed for North and South America and Oceania. The monthly average number of deadly casualties shows a consistently growing trend in the world, Asia, Africa, and even Europe between 2010 and 2014, with a relative decrease toward 2017, although it remains much higher than in the period 2002-2014. There is no temporal pattern on the other continents. High levels of terrorism persist in very few countries, particularly Iraq and Pakistan, and in the most targeted cities, such as Baghdad. Some methodological problems of data collection and adjustment are also discussed. In general, no explicit patterns in the temporal behavior of the events were identified, but, as is known, "absence of evidence is not evidence of absence."
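The centered moving-average smoother can be sketched in a few lines. The book relies on the R package forecast; the Python version below is only an illustrative stand-in, and the half-weighting of the two outermost months follows the standard 2 x 12-MA convention for monthly data rather than anything stated explicitly in the review:

```python
def two_by_twelve_ma(series):
    """Centered two-sided moving average for monthly data (2 x 12-MA):
    average the 6 previous, current, and 6 next months, half-weighting
    the two outermost months so that 12 months are effectively covered."""
    n = len(series)
    out = [None] * n                      # no estimate within 6 months of the edges
    for t in range(6, n - 6):
        window = series[t - 6 : t + 7]    # 13 monthly values centered at t
        total = 0.5 * window[0] + sum(window[1:12]) + 0.5 * window[12]
        out[t] = total / 12.0
    return out

# A flat monthly series is smoothed to itself on the interior months.
smooth = two_by_twelve_ma([5.0] * 24)
```

The smoother reproduces constants and linear trends exactly on the interior, which is why it is a natural trend estimator for monthly counts of attacks.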
Myth 5: Terrorism Occurs Randomly. In contrast to general opinion, the spatial locations of terrorist attacks are deliberately chosen to harm anybody present at such a location; thus, terrorists do not commit actions of pure chance, anywhere at any time. On the scale of a city, clusters of terrorist events' locations can be studied with spatial statistical models against a complete spatial randomness (CSR) process, with spatial autocorrelations related to Tobler's first law of geography: "everything is similar, but near things tend to be more similar." In the presence of autocorrelation, the errors are not independent and identically distributed (iid), the observations are not homogeneous, the ordinary least squares (OLS) regression parameter estimates are inefficient, and statistical tests are biased and lead to wrong inference. Special spatial statistical models can account for the dependency present in the data and improve the accuracy of parameter estimation, although it is difficult to include all factors that affect a social phenomenon. Spatial data can be viewed through suitable "lenses" corresponding to point process, lattice, and geostatistical models. The identification of terrorist clusters has been performed since the 1980s, with estimation of the factors influencing future assaults and identification of locations at higher risk. Point processes have been successfully used for studying such social phenomena as crime, conflict, and insurgency in space and time, and they can be applied to terrorist events as well. For numerical examples on numerous events, the urban areas of two Iraqi cities were studied: Baghdad and Mosul. Between 2002 and 2017, there were 21,238 terrorist attacks in Iraq, and among those the most targeted city was Baghdad, with 8632 attacks. CSR was modeled by
the spatially homogeneous Poisson process for the urban area of the capital of Iraq, with additional simulation for the locations of special objects, such as the U.S. embassy and military bases, with a higher probability of assault. Terrorist aims can be divided into such groups as symbolic targets (to hurt psychological targets such as specific individuals or government representatives), functional targets (persons harmful to the terrorist organization, like politicians and military and police officers), logistical targets (to obtain money or weapons), and expressive targets. Frequencies of attacks by type of target are reported with such numbers: private citizens and property (3242 cases), business (1140), government general (872), police (682), military (359), transportation (258), and religious figures and institutions (233), with their total share about 75% of events, alongside a dozen other main aims. In total, terrorist groups tend to favor specific targeted locations, yielding clustering patterns observed at various temporal and spatial scales.
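The CSR benchmark used for Baghdad can be simulated directly. The sketch below (with invented, purely illustrative parameters) draws a homogeneous Poisson point process on a rectangular window: a Poisson-distributed number of points placed uniformly and independently:

```python
import random

def simulate_csr(intensity, width, height, rng=None):
    """Homogeneous Poisson point process (CSR) on a width x height window:
    the point count is Poisson with mean intensity * area, and the
    locations are uniform and independent."""
    rng = rng or random.Random(42)
    mean_count = intensity * width * height
    # Draw the Poisson count by counting unit-rate exponential gaps below the mean.
    n, t = 0, rng.expovariate(1.0)
    while t < mean_count:
        n += 1
        t += rng.expovariate(1.0)
    return [(rng.uniform(0, width), rng.uniform(0, height)) for _ in range(n)]

points = simulate_csr(intensity=2.0, width=10.0, height=10.0)  # about 200 points expected
```

Observed attack locations are then compared against such simulated CSR patterns; systematic departures (e.g., excess points near high-value targets) indicate that the attacks are not a pure-chance process.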
Myth 6: Hotspots of Terrorism are Static. Spatio-temporal analysis of the change over time in the size and location of hotspot areas at high risk of terrorist attacks leads to investigating the diffusion process in the neighborhood of such events. Similar to the chemical process of diffusion, where a flow from areas of high concentration spreads to areas of lower concentration, terrorist violence can spread from place to place. In analogy with infectious diseases, contagious and noncontagious diffusion processes can be studied. Contagion requires an event which can potentially be triggered and a direct contact in spatial proximity. Noncontagious diffusion can be triggered by imitation and spread by mass media, transnational collaboration, financial support via the internet, population density, and dynamics increased by electronic correspondence. Terrorist groups may also exhibit the phenomenon of relocation diffusion, in which they cease to target the vicinity and transfer their activity beyond the close neighborhood; identification of the relocated activity can then become challenging. Such processes were observed for terrorism in neighboring states, such as Lebanon-Israel, Colombia-Peru, India-Pakistan, and France-Spain (the Spanish organization ETA). The Islamic State (ISIS) was the world's most active terrorist group in 2017, but over the year its activity declined in different parts of Iraq. Changes in its activity were studied via the point process approach to CSR, with estimation of the magnitude and extent of the clustering process. The pair correlation function (pcf) of ISIS events in Iraq was calculated for various values of the radius r (in geographical degrees) and plotted against r, and this diminishing curve illustrates that the attacks are highly clustered at the very close proximity of 0.5 degree (about 50 km). Other maps with multiple panels present the pcf behavior and the terrorist activity locations in the country profiled by the months of 2017, and show the activity declining toward the end of this period, with clustering diminishing to about 22 km. The log-Gaussian Cox process (LGCP) is also employed for stochastic modeling of the spatial distribution of ISIS attacks during this time period, permitting analysis of the effects of socio-economic and environmental factors (e.g., population, altitude, terrain conditions, etc.) on the process. The results show that the intensity of events increases in regions of lower altitude, such as Baghdad and Mosul, which can be easily reached by ISIS. The attacks are clustered within the areas of activity of terrorist groups other than ISIS,
which could correspond to their cooperation. A first-order autoregressive process AR(1) was also tried, to take into account that the intensity of attacks in a given month can depend on the intensity in the previous month. The obtained dynamic patterns are in good correspondence with the observed data, and they suggest that activity is higher in April-June but shrinks in the second half of the year.
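The pair correlation function itself requires kernel estimation and edge correction, but the flavor of testing clustering against the CSR baseline can be conveyed with the simpler Clark-Evans nearest-neighbor index. This is a deliberate stand-in for the pcf, not the book's method, and the points below are invented:

```python
import math

def clark_evans_index(points, area):
    """Clark-Evans index: ratio of the observed mean nearest-neighbor distance
    to its CSR expectation 1 / (2 * sqrt(density)). Values well below 1
    indicate clustering; values above 1 indicate regularity."""
    n = len(points)
    nearest = []
    for i, (xi, yi) in enumerate(points):
        d = min(math.hypot(xi - xj, yi - yj)
                for j, (xj, yj) in enumerate(points) if j != i)
        nearest.append(d)
    observed = sum(nearest) / n
    expected = 1.0 / (2.0 * math.sqrt(n / area))
    return observed / expected

# Two tight clumps of invented points in a 10 x 10 window: the index is far below 1.
clustered = [(0.1, 0.1), (0.12, 0.11), (0.09, 0.13), (5.0, 5.0), (5.02, 5.01)]
print(clark_evans_index(clustered, area=100.0))
```

A pcf estimate carries the same message pointwise in the radius r: values above 1 at small r, as found for the ISIS events, mean more close pairs than CSR would produce.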
Myth 7: Terrorism Cannot be Predicted. The causes of terrorism are multiple, with ideology and belief as the main factors, varying among individuals, groups, regions, and national and transnational levels, and changing over time. It is hardly possible to predict individual acts, but at aggregated levels of data, statistical and stochastic modeling and analysis make it possible to predict terrorism patterns and forecast trends of insurgent movements in time. Three main requirements for the accuracy of prediction models are data quality, data quantity, and theoretical knowledge of the mechanism behind the observed data. Artificial intelligence (AI) and machine learning (ML) algorithms for studying complex data can help counterterrorism forces analyze networks and identify terrorists by sophisticated image and voice pattern recognition. A case study of an ML application predicts terrorism in Iraq, Iran, Afghanistan, and Pakistan from 2002-2017 data on events aggregated within a radius of 0.4 degree, reflecting the GTD uncertainty of geolocation. The study area is discretized by a Voronoi diagram (aka Dirichlet tessellation) with 706 polygons. The data were taken from such sources as the Gridded Population of the World (GPW) on cities with more than 50,000 inhabitants as potential targets, NOAA satellite night lights as a widely used proxy for human activity, and PRIO-GRID on geographic and socioeconomic variables, including the locations of oil fields, closeness to state borders, measures of democracy levels, ethnic fractionalization, etc. The aim of the prediction was to estimate from historical data whether one or more terrorist attacks would occur a week ahead in each of 187,090 predicted cell-weeks. The gradient boosting ML algorithm XGBoost was applied, and the results were mapped for prediction in week 1, week 10, week 100, and week 500. For the four considered countries, the colored Voronoi polygons denote the outcomes: true positive (attacks were observed and correctly predicted by the model), true negative (no attacks happened and zero attacks were predicted), false negative (attacks were observed but not predicted), and false positive (no attacks were observed but the model predicted their occurrence). The model yields about 82% accuracy, which is the number of correct predictions (true positives plus true negatives) divided by the total number of observations. Improvements in data and models should be focused on reducing the most important false negative cases of non-predicted attacks.
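The accuracy figure is straightforward to compute from a confusion matrix. In the sketch below the four counts are invented for illustration and are not the study's actual values:

```python
def accuracy(tp, tn, fp, fn):
    """Share of correct predictions: (true positives + true negatives) / all."""
    return (tp + tn) / (tp + tn + fp + fn)

# Hypothetical confusion-matrix counts, NOT the study's reported numbers:
tp, tn, fp, fn = 500, 152_000, 14_000, 20_000
print(round(accuracy(tp, tn, fp, fn), 3))  # 0.818
```

The example also shows why accuracy alone can mislead on such imbalanced data: a model can score above 80% while still missing many attacks (the false negatives the review singles out as most important).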
The conclusion, in the ninth chapter, is titled Terrorism: Knowns, Unknowns, and Uncertainty. It finalizes the findings of the analysis of terrorism data gathered from the most comprehensive dataset, GTD, and states that this complex phenomenon of modern human society remains an ambiguous concept requiring more elaborate scientific definitions. It demystifies the common beliefs that terrorism is only about killing civilians and mostly strikes Europe and North America, that it only increases over time, and that attacks occur randomly and are unpredictable. The results of various statistical modeling
approaches demonstrate that it is possible to obtain a much
more precise picture of terrorism activity and forecast it.
Each chapter offers mathematical definitions, a glossary, and additional reading sources. Besides those, the book supplies a bibliography of 153 recent works and multiple links to internet sites. The book presents incredibly fascinating research, and can be interesting and useful not only to specialists but also to the general public for understanding and making informed judgments on terrorism and on debunking it with the help of statistical data analysis and prediction to prevent future attacks.
Stan Lipovetsky
Minneapolis, MN
The Equation of Knowledge: From Bayes' Rule to a Unified Philosophy of Science, by Lê Nguyên Hoang. Boca Raton, FL: Chapman and Hall/CRC Press, Taylor & Francis Group, 2020, xxi+438 pp., $64.95 (hbk), ISBN: 978-0-367-42815-0.
The monograph is devoted to Bayesian statistical data analysis in inductive reasoning to infer scientific knowledge. In contrast to classic logic with its true-or-false dichotomy in deductive thinking, Bayesian-based epistemology, operating with the estimation of continuous probability values, is more adequate to the complex problems with which the human mind deals in the contemporary sciences, from physics and biology to the social and economic disciplines. The Bayesian philosophy in the approach to various scientific and practical problems (the author calls it Bayesianism) has been becoming a compelling approach in the modern framework of reasoning. The famous Bayes' rule serves for updating the credence of knowledge with new empirical data, which can support or contradict the accepted theories, and Bayes' formula can actually be called the equation of knowledge. Besides pure Bayesianism, the rise of artificial intelligence through machine learning and the massive data available nowadays produces the new abilities of numerous state-of-the-art algorithms and leads to combining computer science and probability theory into a wide variety of approaches called pragmatic Bayesianism.
The monograph consists of 22 chapters arranged in 4 sections. Section I, Pure Bayesianism, starts with Chapter 1, On A Transformative Journey, where the author shares his enthusiastic discoveries of the helpfulness of Bayes' law in his teaching and life experience, claims that Bayesianism is a universal philosophy of knowledge, discusses subjective frequency as a model-dependent probability, and describes the structure of the book. Chapter 2, Bayes' Theorem, presents several illustrations of conditional probability estimation. One example is a known puzzle: a man has two kids, one is a boy; what is the probability that the other is a boy too? Suppressing an intuitive impulse toward a solution of 1/4 or 1/2, denote a boy as b and a girl as g; from all the variants of two kids, bb, bg, gb, and gg, exclude the last one; then the bb probability equals 1/3. Next come the famous Monty Hall problem and its correct solution proposed by Marilyn vos Savant, the trial case of Sally Clark and the legal arguments, and the reliability of a test asserting Ebola disease, evaluated via the posterior conditional probability in Bayes' formula expressed through the prior and likelihood terms. Chapter 3,
Logically Speaking, considers Aristotle's syllogism in logical deduction and the inference rules decomposed into the steps of universal instantiation and modus ponens, with further properties of modus tollens and the contrapositive, and propositions expressed via quantifiers and predicates. The Peano axioms for the natural numbers and the Zermelo-Fraenkel axioms (ZF) with their extension by the axiom of choice (ZFC) are noted in the context of Gödel's incompleteness theorem, which asserts that there exist formulae that the axioms can neither prove nor disprove, including in ZF and ZFC. Classical logicians, aka Platonists, typically interpret Gödel's theorem as a deficiency of axioms, so there are true theorems without proof. For intuitionists, aka constructivists, Gödel's theorem asserts that in any theory there are sentences for which no proof in favor or against can be constructed. Besides Gödel's theorem, the intuitionists reject all nonconstructive proofs that Platonists accept, for instance, the Banach-Tarski paradox, the existence of bases of vector spaces, or the uniqueness of algebraic closures. Bayesian deductive logic is neither classical nor intuitionist; it is not limited to the values 0 and 1, so it exceeds the true-false dichotomy and allows a generalization to operating with different degrees of certainty. Thus, the binary language of classical logic is limited because it ignores the extent of confirmation or the magnitude of rejection. The Bayesian approach can incorporate different theories' predictions into weighted estimates by the ensembling or bagging techniques known in machine learning. Chapter 4, Let's Generalize!, recaps the works of David Hume and Karl Popper in epistemology; Karl Pearson, Egon Pearson, Jerzy Neyman, and Ronald Fisher on the frequentist understanding of probability; hypothesis testing; problems with p-values for big data; and p-hacking for scientifically publishable results. The equation of
knowledge, which is Bayes' formula, is suggested for checking how adequate a Theory is to the Data:

P(Theory | Data) = P(Data | Theory) P(Theory) / [P(Data | Theory) P(Theory) + Σ_Alter P(Data | Alter) P(Alter)],
where Alter runs over all the alternative theories. From the prior P(Theory), the likelihood P(Data | Theory), and the partition function P(Data), which equals the expression in the denominator, the posterior probability P(Theory | Data) of the Theory supported by the Data is obtained. In the cumulative process of integrating newly collected data, News, to refine the credence of the Theory, the Bayesian inference, or update, is presented as follows:

P(T | News and D) = P(News | T and D) P(T | D) / P(News | D),
where T and D denote the Theory and the Data, respectively. When the new data News have been obtained independently of the past data D, the Bayesian inference reduces to

P(T | News and D) = P(News | T) P(T | D) / [P(News | T) P(T | D) + Σ_A P(News | A) P(A | D)],

where the sum runs over the alternative theories A.
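These update formulas amount to renormalizing prior-times-likelihood over all candidate theories, which takes only a few lines to demonstrate. The numbers below are invented, with a single pooled alternative standing in for all rival theories:

```python
def posterior(prior, likelihood):
    """Bayes update over a finite set of theories:
    P(T | D) = P(D | T) P(T) / sum over A of P(D | A) P(A)."""
    z = sum(prior[t] * likelihood[t] for t in prior)   # partition function P(Data)
    return {t: prior[t] * likelihood[t] / z for t in prior}

# Invented numbers: one theory against a single pooled alternative.
prior = {"Theory": 0.5, "Alter": 0.5}
after_data = posterior(prior, {"Theory": 0.8, "Alter": 0.2})
# Independent new data News updates the posterior again (the sequential form above):
after_news = posterior(after_data, {"Theory": 0.6, "Alter": 0.4})
print(after_data["Theory"], round(after_news["Theory"], 3))
```

The second call uses the first posterior as the new prior, which is exactly the independence-based reduction of the last displayed formula.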
