
StratComAPAC 2019

24-25 April 2019, Singapore

Photo credit: hackernoon.com

SPEAKERS
Prof. Dr. Kathleen M. Carley
Professor of Computer Science in the Institute for Software Research, IEEE Fellow, and Director of the Center for Computational Analysis of Social and Organizational Systems, Carnegie Mellon University


Dr. Carley is a Professor of Computer Science in the Institute for Software Research, an IEEE Fellow, and Director of the Center for Computational Analysis of Social and Organizational Systems at Carnegie Mellon University. She joined Carnegie Mellon in 1984 as Assistant Professor of Sociology and Information Systems. In 1990 she became Associate Professor of Sociology and Organizations, in 1998 Professor of Sociology, Organizations, and Information Technology, and in 2002 she attained her current role as Professor of Computation, Organization, and Society. She is also the CEO of Carley Technologies Inc., aka Netanomics.

Dr. Carley's research combines cognitive science, sociology, and computer science to address complex social and organizational issues. Her most notable research contribution was the establishment of Dynamic Network Analysis (DNA) and the associated theory and methodology for examining large, high-dimensional, time-variant networks. Her research on DNA has resulted in tools for analyzing large-scale dynamic networks and various multi-agent simulation systems. She has led the development of tools for extracting sentiment and social and semantic networks from social media and other textual data (AutoMap & NetMapper), simulating epidemiological models (BioWar), and simulating changes in beliefs and practice given information campaigns (Construct). Her ORA system is one of the premier network analysis and visualization technologies supporting geo-temporal analysis of social network and high-dimensional/meta-network data. It includes special features for handling small and big data, social media data, and network dynamics, and it is used worldwide. Illustrative projects include assessment of fake news and social cyber-security threats, IRS outreach, the impact of NextGen on airline re-routing, counter-terrorism modeling, counter-narcotics modeling, health analytics, and social media based assessment of crises such as Benghazi, Darfur, and the Arab Spring.

Speech Abstract

Information Maneuver Assessment

Our opinions are shaped by what we read, watch, and experience, and by whom we interact with. Today, interactions within social media are part of the fabric that shapes our opinions. Sharing narratives in social media, and forming new narratives, is key to influencing the opinions held by groups. Many focus on message campaigns as a way of understanding how groups are influenced in social media. However, the really impactful information maneuvers in social media are not just about shaping the narrative; they are also about shaping the group. This talk describes how actors can exploit human social cognition, decision biases, and features of social media technology to influence groups. The BEND characterization of information maneuvers is described. BEND classifies information maneuvers into those that shape the group by altering the social network (who is interacting with whom, and when) and those that shape the narrative by altering the knowledge network (what ideas are linked to each other, and by whom). The role of tools such as bots and memes in effecting these maneuvers is illustrated.

Assoc. Prof. Dr. Ullrich Ecker
University of Western Australia’s School of Psychological Science

Associate Professor Ullrich Ecker is a cognitive psychologist studying memory and reasoning at the University of Western Australia’s School of Psychological Science. He teaches Cognitive Psychology and Research Communication Skills, and is the School’s Director of Community and Engagement.


Ullrich obtained his PhD in 2007 from Saarland University (Germany). He is now recognized internationally as an expert on the effects of misinformation on people's reasoning and decision making. His research provides an in-depth understanding of the challenges we face with modern media and of our ability to rationally assess the information presented to us. Understanding the issue of misinformation is particularly important when disseminating corrective information to counteract false and misleading claims. Ullrich's work hence has implications for education, journalism, science communication, and the design of public information campaigns.


Ullrich has produced more than 50 peer-reviewed publications in international journals of the highest caliber, and his work has been cited more than 3,000 times. He has been awarded a prestigious Australian Research Council fellowship, as well as the 2011 Outstanding Young Investigator Award and the 2014 Vice-Chancellor's Mid-Career Research Award from the University of Western Australia.

Speech Abstract

The Psychology of Misinformation

In this talk, I will discuss the origins of misinformation and the reasons why misinformation is seen as a growing problem in the contemporary media landscape. I will explain how misinformation can influence human memory and reasoning even after it has been retracted or refuted. I will discuss factors, such as cognitive biases and communication formats, that make corrections more or less effective. I will explain how corrections can sometimes even backfire, ironically strengthening the misconceptions they aim to reduce. Finally, I will suggest some techniques that can help combat misinformation, and I will highlight some of the challenges that scholars, fact-checkers, and communicators face in the so-called post-truth era.


Prof. Dr. Nitin Agarwal

Jerry L. Maulden-Entergy Endowed Chair & Distinguished Professor

Nitin Agarwal is the Jerry L. Maulden-Entergy Endowed Chair and Distinguished Professor of Information Science at the University of Arkansas at Little Rock. He is the founding director of the Collaboratorium for Social Media and Online Behavioral Studies (COSMOS) at UA Little Rock. His research aims to push the boundaries of our understanding of the digital and cyber-social behaviors that constantly emerge and evolve on modern information and communication platforms. At COSMOS, he is leading projects funded by over $10 million from an array of federal agencies, including NSF, ONR, ARO, AFRL, DARPA, and the Department of State, and he plays a significant role in the long-term partnership between UA Little Rock and the Department of Homeland Security. He has developed publicly available social media mining tools, viz., Blogtrackers, YouTubeTracker, and Focal Structure Analysis, used by NATO Strategic Communications and public affairs practitioners, among others. Dr. Agarwal participates in the national Tech Innovation Hub launched by the U.S. Department of State to defeat foreign-based propaganda.

Visit http://ualr.edu/nxagarwal/ for more details.

Speech Abstract

Deviant Mobs of the Internet: Tactics, Techniques, and Procedures

Social media platforms are widely used for sharing information. Although social media use is generally benign, such platforms can also be used for a variety of malicious activities, including the dissemination of propaganda, hoaxes, and fake news to influence the public. The availability of inexpensive and ubiquitous mass communication tools has made such malicious activity much more convenient and effective. This talk will touch upon our various research efforts that demonstrate how such disinformation campaigns work, examining the critical link between blogs and other social media platforms (viz., YouTube, Twitter, Facebook, VKontakte, etc.) and the different media orchestration strategies employed. Using socio-computational models that leverage social network analysis and cyber forensic analysis, prominent information actors and leading coordinators of disinformation campaigns are identified. Further, the talk will highlight the tactics, techniques, and procedures used by deviant groups to propagate disinformation, including the use of machine-driven communications (MADCOMs), e.g., bots. Of the several case studies the research methodology has been applied to, the talk will illustrate massive disinformation campaigns pertaining to the Baltic region and NATO's military exercises, conducted primarily through blogs (and vlogs) but strategically linking to a variety of other social media platforms.

Prof. Dr. Björn Ommer
Professor of Computer Vision, Heidelberg Collaboratory for Image Processing (HCI) & Interdisciplinary Center for Scientific Computing (IWR), Universität Heidelberg

Björn Ommer is a full professor of Scientific Computing and leads the Computer Vision Group at Heidelberg University. He studied computer science, with physics as a minor subject, at the University of Bonn, Germany. His diploma (~M.Sc.) thesis focused on visual grouping based on perceptual organization and compositionality. He then pursued his doctoral studies at ETH Zurich, Switzerland, in the Pattern Analysis and Machine Learning Group headed by Joachim M. Buhmann. He received his Ph.D. degree from ETH Zurich in 2007 for his dissertation "Learning the Compositional Nature of Objects for Visual Recognition", which was awarded the ETH Medal. Thereafter, Björn held a post-doc position in the Computer Vision Group of Jitendra Malik at UC Berkeley. He serves as an associate editor for the journal IEEE T-PAMI and previously for Pattern Recognition Letters. Björn is one of the directors of the HCI and of the IWR, principal investigator in the research training group 1653 ("Spatio/Temporal Graphical Models and Applications in Image Analysis"), and a member of the executive board and scientific committee of the Heidelberg Graduate School HGS MathComp. He has received the Outstanding Reviewer Award at ICCV'15, CVPR'14, ICCV'13, CVPR'11, and CVPR'10 and has served as Area Chair for ECCV'18. Björn organized the 2011 DAGM Workshop on Unsolved Problems in Pattern Recognition.

Speech Abstract

Deep Self-Supervised Disentanglement for Visual Synthesis: Great Potential & Great Challenges

Deep generative models open up entirely new avenues for automatic image understanding as well as for image and video synthesis. The talk will highlight the principal reasons that limited the potential of such models until very recently and discuss a novel solution. I will present a fully automatic approach to deep learning based on self-supervision and show the immense potential of disentangled representation learning. Example applications include visual synthesis for image and video retargeting and artistic style transfer. Besides showing the benefits of this new technology, I would also like to discuss its implications and challenges from a societal perspective.

Assoc. Prof. Edson C. Tandoc Jr.
Nanyang Technological University's College of Humanities, Arts, & Social Sciences

Edson C. Tandoc Jr. (Ph.D., University of Missouri) is an Associate Professor at the Wee Kim Wee School of Communication and Information at Nanyang Technological University in Singapore. He is also an Associate Editor of Digital Journalism, the Chair of the Newspaper and Online News Division (NOND) of the Association for Education in Journalism and Mass Communication (AEJMC), and the incoming Secretary of the Journalism Studies Division of the International Communication Association (ICA).

His research focuses on the sociology of message construction in the context of digital journalism. He has conducted studies on the construction of news and social media messages. His studies of influences on journalists have focused on the impact of journalistic roles, new technologies, and audience feedback on the various stages of the news gatekeeping process. For example, he has done work on how journalists use web analytics in their news work, and with what effects. This stream of research has led him to study journalism from the perspective of news consumers as well, investigating how readers make sense of critical incidents in journalism and take part in reconsidering journalistic norms, and how changing news consumption patterns facilitate the spread of fake news.

He earned his undergraduate journalism degree from the University of the Philippines (summa cum laude) and his master of mass communication degree from Nanyang Technological University, where he was awarded the Media Development Authority Award (top MMC graduate). He also worked as a newspaper journalist for six years prior to his doctoral studies.

Speech Abstract

Fake vs. News: What does fake news look like compared to real news?

Journalism has constantly engaged in policing its boundaries—more so now when such boundaries have arguably become more porous due to changes in communication technologies and journalist-audience relationships. Such boundary work is particularly pronounced when it comes to news production. While news has distinguished itself from other forms of writing by emphasizing the binary of fact versus fiction, claiming that news is based on facts, it has also established other rules to mark its boundaries. For example, news claims to adhere to the norm of objectivity, enacted through the inverted pyramid format, and invoked by reliance on news values as determinants of what becomes news. Such rules have demarcated the line between what counts as news and what does not, perpetuating and protecting the boundaries of news.


But in the last two years, real news has faced challenges from the rise of fake news. Following the 2016 elections in the United States, fake news has become a prominent topic of public discussion. Not only does real news face threats from fake news in terms of audience engagement (a growing number of studies find that fake news articles tend to be shared more often than real news articles), but real news organizations now also find themselves having to defend their ranks from being branded as fake news producers by politicians unhappy with their news coverage.


Scholars have highlighted the “fakeness” of fake news by illuminating the kinds of deception involved and the motivations of those who deceive, consistent with the fact vs. fiction binary. But this current study looks at the “newsness” of fake news by examining the extent to which it imitates the characteristics and conventions—the rules—of traditional journalism when it comes to news. Through a content analysis of 886 fake news articles, we find that in terms of news values, topic, source, and format, fake news articles look very much like traditional—and real—news. The majority of fake news articles included the news values of timeliness, proximity, negativity, and prominence; were about government and politics; referred to establishment sources; and were written in an inverted pyramid format. However, one point of departure is in terms of objectivity, operationalised by previous studies as well as in this current study as the absence of the author’s personal opinion. The analysis found that the majority of fake news articles included the personal opinion of their author or authors.


Assoc. Prof. Jason Vincent A. Cabañes

De La Salle University, Manila, Philippines

Jason Vincent A. Cabañes is Associate Professor in Communication and Research Fellow at De La Salle University—Manila in the Philippines. Alongside his primary research interest in the mediation of cross-cultural solidarities and intimacies, he also does work on the conditions of digital labour in the Global South. In line with the latter, he recently co-led the digital labour stream of the British Council-funded Newton Tech4Dev Network. During this time, he co-authored the public report The Architects of Networked Disinformation: Behind the Scenes of Troll Accounts and Fake News Production in the Philippines as well as the chapter "The rise of trolls in the Philippines (and what we can do about it)" for the edited book A Duterte Reader. He also has an upcoming co-authored chapter, "Fake news and scandal", in The Routledge Companion to Media and Scandal. His other works have appeared in top-tier publications such as New Media and Society; Media, Culture, and Society; and the International Journal of Cultural Studies.

Speech Abstract

Innovations in conceptualising digital disinformation

In this presentation, I preview the upcoming piece I have co-authored with C.W. Anderson and Jonathan Corpus Ong about the four ways we might better approach digital disinformation. As an empirical anchor for this discussion, I focus on the rise of networked disinformation in the Philippines. Through this case, I underscore the value of looking at digital disinformation (1) not only as textual but also as visual, (2) not only as discrete sets of information but as embedded in broader social narratives, (3) not only as content but as an instantiation of cultural production, and (4) not only from a "Western-centric" frame but also from a global and comparative perspective. I also tease out the implications of these conceptual innovations for how we might generate bespoke and local solutions to this pernicious phenomenon.

Dr. Aim Sinpeng
Lecturer, The University of Sydney's Department of Government and International Relations 

Dr Aim Sinpeng is a Lecturer in the Department of Government and International Relations and a Co-Founder of the Sydney Cyber Security Network at the University of Sydney. She has published widely on social media and politics in Southeast Asia and is currently managing a large grant-funded project on disinformation, trust and political participation in Southeast Asia.


Speech Abstract

Users' Behavior and Trust in Social Media

Existing studies on the spread of disinformation on social media tend to overlook users' personal attributes that may make them more likely to engage with false information. This talk is based on a unique survey of social media users during election campaigns in the Philippines, Thailand, Malaysia, and Indonesia. Findings from the surveys show that users have varying levels of trust in information on Facebook. Socioeconomic status, demography, and offline political participation seem to influence how much trust users place in Facebook. This cross-country analysis demonstrates that focusing on technical factors, such as artificial intelligence, may not be sufficient to tackle the spread of disinformation without a more nuanced understanding of users' behavior.

Dr. Ritu Gill

Defence Scientist, Toronto Research Centre

Ritu Gill received her PhD in social psychology from Carleton University. She started her career as a research manager in the Research Branch of Correctional Service Canada. Currently she is a defence scientist at the Toronto Research Centre, where she is Team Leader of the Psychological Effects Team within the Joint Targeting Section. Her current research examines influence activities, specifically social media as an influence capability, disinformation, and online target audience analysis. She serves on international defence research collaborations, including 'Understanding Influence' with Sweden and the Netherlands, and is co-chair and Canadian representative for NATO HFM 293 'Social Media Assessment for Effective Communication and Cyber Diplomacy'. She has served as a peer reviewer for several academic journals, including Women and Criminal Justice as well as the American Psychological Association Journal of Personality and Social Psychology Bulletin. Dr. Gill has also taught several introductory psychology courses at Carleton University.

Speech Abstract

Disinformation via Social Media


Dr. Gill examined disinformation via social media and its impact on militaries and societies by conducting interviews with Canadian Armed Forces (CAF) subject matter experts. Given the pervasiveness and effectiveness of disinformation employed by adversaries on target audiences, particularly during major national events such as elections, and with 2019 being an election year for Canada, her study assessed several aspects of disinformation, including i) indicators of disinformation, ii) the impact of disinformation on military activities, iii) methods and strategies used to counter disinformation, and iv) how to inoculate the military and society against disinformation. Results indicated that, in order to effectively counter disinformation, the focus needs to be on identifying the military's core strategic narrative and reinforcing that larger narrative in all communications. Tactical messages that are disseminated should be focused on supporting the larger strategic narrative. Furthermore, in order to foster resilience to disinformation among militaries and the public, inoculation is key; inoculation can be attained through education as part of pre-deployment training for the military, and through public service announcements via traditional formats and social media for the public, particularly during critical events such as national elections. The results of this study are particularly timely and relevant given that 2019 is an election year for Canada, and interference has already occurred, with 21,600 troll tweets from suspected foreign influence campaigns on divisive issues such as pipelines and immigration in Canada.

James F. Rosie

Principal Anthropologist, Defence Science and Technology Laboratory (Dstl), UK MoD

James started his career with the British Army, including two operational deployments to Afghanistan and a posting to the British Embassy, Beijing. After leaving the Army, he joined Dstl, working on projects within the Information Operations space, such as human terrain and network analysis. This was complemented by further deployments to Afghanistan as a civilian analyst. Following these, he took on a role as team leader for the Dstl Behavioural & Cultural Systems team, before being seconded to UK Joint Forces Command as a Scientific Advisor in 2015. In April 2018 he returned to Dstl, where he is now a Principal Anthropologist and has spent the last year working on novel influence research, including developing a new capability in Behavioural Analytics for the MoD. Recently, James was elected a Fellow of the Royal Anthropological Institute.

Speech Abstract

Behavioural Analytics: Bringing Together the Human & Data Sciences


Many are already talking about information overload, but all of the trends suggest that the volume of information and data available to governments, organisations, and individuals will continue to grow. Although the growth in data science capabilities is apparent, there is far less focus on the application of the human and behavioural sciences, whether anthropological perspectives on how we engage with data and technology, or cognitive psychology to understand the impact on our own decision making. This talk focuses on how we have started to develop a novel, fully interdisciplinary capability around the data and human sciences, titled Behavioural Analytics, and offers an opportunity to discuss some of the key lessons we have learnt over our first year.

Judith van de Kuijt
Researcher at TNO Defence, Security and Safety

Judith van de Kuijt studied Conflict Studies at Radboud University and Military Strategic Studies at the Netherlands Defence Academy. Since 2015 she has worked as a research scientist at the Netherlands Organisation for Applied Scientific Research (TNO), where she currently works in the Military Operations department, conducting research that supports military and security operations. She has experience in behavioural influence, Information Operations, and Intelligence.

Speech Abstract

Disinformation in the Dutch context

The Netherlands Armed Forces depend on accurate information about, for example, the situation in mission areas and potential threats, for the effective execution of their three main tasks: (1) contributing to peace, security, and stability worldwide; (2) protecting Dutch and Allied (NATO) territory; and (3) providing assistance to civilians and civil authorities worldwide. In current military operations there is an information overload, but this by no means guarantees that the information gives a factually correct representation of reality. The current Information Environment (IE) has become increasingly contaminated, as every individual can spread information and messages. Over the past years, it has become clear that disinformation is spread via social media by both state and non-state actors (e.g. China, Russia, and ISIS). Disinformation is not new. However, the age of the Internet and the development of social media platforms have created a new medium for spreading messages, and with it a new ability to propagate disinformation.

On the one hand, the Netherlands Armed Forces have to deal with the increasing importance of the Internet and social media as a source of information (e.g. mobilizing communities to provide assistance in humanitarian disaster relief). At the same time, social media provides fertile ground for disinformation and the manipulation of perceptions and attitudes, with potentially great impact on decision-making processes and the functioning of the armed forces. Moreover, adversaries effectively use disinformation as a tool in information warfare to undermine the credibility of operational missions.

Given the pervasiveness and effectiveness of disinformation employed by adversaries on target audiences, such as the Netherlands in general and the Netherlands Armed Forces in particular, this presentation provides insight into several aspects of disinformation in the Dutch context. It discusses examples of disinformation campaigns against the Netherlands Armed Forces, as well as the targets of these campaigns and several lines of operation during the eFP mission. It also assesses the impact of disinformation on military activities and how to inoculate the military against disinformation. It concludes with research conducted at TNO regarding online influence, fake news, and Artificial Intelligence (AI).

RSVP


#StratComAPAC2019

Contact us

victoriachua at ntu.edu.sg

jdauwels at ntu.edu.sg

Disclaimer Regarding Photo-Taking and Audio/Video Recording During the Event:

By participating in this event, you consent to photography and to its release, publication, exhibition, or reproduction for news, webcasts, promotional purposes, telecasts, advertising, inclusion on websites, or any other purpose(s) that the organisers, their vendors, partners, affiliates, and/or representatives deem fit. You release all persons involved from any liability connected with the taking, recording, digitising, or publication of photographs.

In addition, the event will be monitored for unauthorised recording. By attending the event or entering the event premises, you agree not to record or digitise any parts of the presentation and panels. If you attempt to use a recording device, you consent to your immediate removal from the premises and forfeiture of the device.
