Archive Information Management
November 26, 2012
Christos Nikolaou, University of Crete
On Competing Service Systems
When innovative new services are introduced to market and society, it is usually the case that a group of service providers (frequently under the leadership of a sector-dominant provider) forms an alliance that implements the new service as a composition of some of their own offerings (services, goods, and resources). The participating service providers set their own business objectives for entering the coalition, and these will have to be satisfied if the emerging service system is to be successful against the competition and sustainable in the long run. One of the many interesting questions that emerge in this context is the following: how can these business objectives be translated into appropriate constraints for the design of the necessary business processes and for resource and infrastructure (cloud) service provisioning? And how can these elements (the structure of the service system, the business processes, the infrastructure services) be adapted to a changing and competitive business environment so that old or new business objectives are still met?
I will present some initial research results that provide a first handle on these problems. I argue that we need a fresh, holistic approach to deal with them. We have to gain a deeper understanding of how value is created in service systems, how people and systems contribute to this creation, and how sensitive this value creation is to our, most of the time unsuspecting, technical decisions on IT systems and infrastructures. I will also discuss a first attempt at a unified quantitative framework that could be useful in addressing these issues and concerns.
October 5, 2012
Kim van Oorschot, Norwegian Business School
Pilot error? Managerial decision-biases against concurrency as explanation for delays in new aircraft development programmes
Henk Akkermans & Kim van Oorschot
The majority of major aircraft development programmes are severely delayed. This is not only attributable to the technical complexity of these projects. From the literature on safety and human error, we know that the majority of major incidents in dynamically complex settings, such as new aircraft development, are caused by human error. We also know that concurrency between design phases is an effective approach to facilitate team learning and therefore speed up project progress. However, the human decision-making literature tells us that people become more risk-averse in settings of high uncertainty. Therefore, it seems plausible that in new aircraft development programmes managers opt for less concurrency, and that this choice can contribute to the overall project delay instead of preventing it. Based on system dynamics modeling, our research examines the impact of opting for less-than-normal concurrency between development stages in new aircraft development programmes on overall project duration and costs. Our findings suggest that a greater degree of concurrency leads to earlier starts of downstream learning-curve trajectories and to earlier feedback to upstream stages. Lowering first-time-right quality standards therefore, counter-intuitively, leads to higher quality levels in development being reached sooner, not later, with major effects on manufacturing completion schedules and aircraft sales later on.
September 27, 2012
Henk Akkermans, Tilburg University
Anatomy of a Decision Trap in Complex New Product Development Projects
Kim van Oorschot (BI Norwegian Business School), Henk Akkermans (Tilburg University), Kishore Sengupta (INSEAD), Luk van Wassenhove (INSEAD)
We conducted a longitudinal process study of one firm’s failed attempt to develop a new product. Our extensive analysis of the data suggests that teams in complex dynamic environments characterized by delays are subject to multiple information filters that blur their perception of actual project performance. Consequently, teams do not realize the project is in trouble and repeatedly fall into a decision trap of stretching current project stages at the expense of future stages. This gradually reduces the likelihood of project success. However, because of the information filters, teams fail to notice until it is too late.
September 6, 2012
Nilmini Wickramasinghe, RMIT University
From Idea to Realisation: Key Success Factors for Enabling mHealth Solutions: the Case of DiaMonD
To date, the adoption and diffusion of technology-enabled solutions to deliver better healthcare has been slow. There are many reasons for this. One of the most significant is that the methodologies normally used for general ICT implementations tend to be less successful in a healthcare context. This presentation reports on a longitudinal study in which an appropriate business and delivery framework, as well as a knowledge-based adaptive mapping-to-realisation methodology, were constructed as a means to traverse from idea to realisation rapidly, yet without compromising rigour, so that a unique and successful ICT-enabled, patient-centric, value-driven healthcare solution may ensue. The solution is discussed in connection with implementing a superior wireless ICT-enabled approach for facilitating chronic disease management.
Nilmini Wickramasinghe (PhD; MBA; GradDipMgtSt; BSc; Amus.A, piano; Amus.A, violin) researches and teaches in several areas within information systems, including knowledge management, e-commerce and m-commerce, and the organizational impacts of technology, with a particular focus on the application of these areas to healthcare and thereby effecting superior healthcare delivery. She is well published in all these areas, with more than 300 refereed scholarly articles, several books, and an encyclopedia. In addition, she regularly presents her work throughout North America, as well as in Europe and Australasia. Professor Wickramasinghe is the founder and editor-in-chief of two scholarly journals, both published by Inderscience: the International Journal of Networking and Virtual Organisations (IJNVO, www.inderscience.com/ijnvo) and the International Journal of Biomedical Engineering and Technology (IJBET, www.inderscience.com/ijbet). Currently, Nilmini Wickramasinghe is Professor at the School of Business Information Technology at RMIT University, Deputy Head of Research, a core member of the Health Innovation Research Institute at RMIT University, and the inaugural Epworth Chair in Health Information Management.
August 13, 2012
Rainer Alt, University of Leipzig
Design Options for Service Directories in Business Networks
Web services and service-oriented architectures (SOA) are spreading in many organizations as a way to achieve business interoperability of their intra- and inter-organizational business processes. SOA is based on the idea that service providers develop and publish web services via standardized interfaces in directories (registries), where the services are found and bound by service consumers. While these registry structures have evolved into a standard for local SOA implementations, the question remains how service directories should be organized in a business network, i.e., when multiple companies with individual SOA solutions interact. This research develops a framework for the analysis of service directories in business networks and provides design options for combining separate and distributed service directories. These design options are based on the range, reach, and richness of web service markets in the business network. The framework is applied to two business network cases.
Keywords: service repository, service oriented architecture, web services, business network.
June 18, 2012 (Workshop)
Angelika Dimoka and Paul A. Pavlou, Temple University, Fox School of Business, Center for Neural Decision Making
How to Conduct a Functional Magnetic Resonance Imaging (fMRI) Study in Social Science Research: Evidence from Two fMRI Studies
This research outlines a set of guidelines for conducting functional Magnetic Resonance Imaging (fMRI) studies in social science research in general and, accordingly, in Information Systems (IS) research. Given the increased interest in using neuroimaging tools across the social sciences, this study aims at specifying the key steps needed to conduct an fMRI study while ensuring that enough detail is provided to evaluate the methods and results. The outline of an fMRI study consists of four key steps: (1) formulating the research question, (2) designing the fMRI protocol, (3) analyzing fMRI data, and (4) interpreting and reporting fMRI results. These steps are described with an illustrative example of a published fMRI study on trust and distrust (Dimoka, 2010). The paper contributes to the methodological literature by (a) providing a set of guidelines for designing and conducting fMRI studies, (b) specifying methodological details that should be included in fMRI studies in academic venues, and (c) illustrating these practices with an exemplar fMRI study. Future directions for conducting high-quality fMRI studies in the social sciences are discussed.
Why Do Consumers Choose Similar Avatars in Online Shopping Environments? A NeuroImaging Study of Similarity-Attraction and Dissimilarity-Repulsion
Virtual sales assistants with human-like interfaces (“avatars”) help consumers make better choices with less cognitive effort in online shopping environments. Evidence suggests that consumers prefer avatars that are similar to them in terms of ethnicity and gender. Two competing theories explain this phenomenon: similarity-attraction (consumers are attracted to similar avatars) and dissimilarity-repulsion (consumers are repulsed by dissimilar avatars). A neuroimaging study tests these two competing theories by identifying the neural correlates of similarity/dissimilarity using functional Magnetic Resonance Imaging (fMRI) while consumers assess avatars that vary in their ethnicity and gender. The fMRI results show activation in a brain area associated with utility (caudate nucleus) for similarity, only in men, and mainly due to ethnic similarity. In contrast, dissimilarity spawns activity in brain areas linked to negative emotions (amygdala, insular cortex) and social prejudice (Brodmann Areas 9 and 32), only in women, and mainly due to gender dissimilarity. While both men and women ultimately choose “matched” avatars, men do so because of utility derived primarily from ethnic similarity, while women do so because of emotionally charged social prejudice, primarily from gender dissimilarity.
June 04, 2012 (Faculty Seminar)
Lynne Markus, Bentley University
Mortgaged Future: Did IT Have a Role in Creating the Financial Crisis?
Information technology and the transparency it can bring are often cited as solutions to current and future global financial problems.
But could IT also have had a role in causing the mortgage meltdown? Many explanations of the global financial crisis have been offered, including global capital flows, government intervention (or the lack thereof), and gambling, greed, or fraud. This presentation examines how the use of information technology, in the form of decision support (e.g., automated underwriting), transaction processing and electronic data exchange, and new information-based financial products (e.g., adjustable rate mortgages, collateralized mortgage obligations, etc.), may have contributed to the crisis. Proposals for enhancing technology to prevent future crises are discussed, as are implications for future research on large-scale sociotechnical systems.
M. Lynne Markus is the John W. Poduska, Sr. Professor of Information and Process Management at Bentley University. Professor Markus’s teaching, research, and consulting interests include enterprise and inter-enterprise systems, IT governance, and IT-enabled organization change. Her paper “Industry-wide IS Standardization as Collective Action: The Case of the US Residential Mortgage Industry” (MIS Quarterly, 2006, with Charles W. Steinfield, Rolf T. Wigand, and Gabe Minton) won three best paper awards. She was named Fellow of the Association for Information Systems in 2004 and received the AIS LEO Award for Exceptional Lifetime Achievement in Information Systems in 2008.
May 31, 2012
Doug Vogel, City University of Hong Kong, AIS President-elect
In Search of Technology to Support Learning
Learning can now be supported by a myriad of technologies, all of which have both intended (and unintended) effects at the individual level (as educators, students, and citizens) as well as at the program and institutional levels. In this presentation, I will look at a number of research successes (and failures) with which I have been personally involved over the past 30 years. Aspects of impact and pedagogy as well as institutional direction will be addressed. Implications for the future of technology to support learning and directions for future research will be suggested.
May 24, 2012
Tina Comes, Karlsruhe Institute of Technology (KIT)
Designing distributed decision support systems for complex and uncertain situations
Strategic decisions are most often subject to complexity and uncertainty. The main dilemma for decision-makers in these situations is the following: the more complex and uncertain the problem, the greater the need for systematic and formalised support, but the use of these formal approaches forces decision-makers to make important assumptions about the information. This presentation focuses on situations that defy the use of standardised methods and tools. It outlines how a distributed approach to scenario construction and evaluation can support decision-makers facing complexity and uncertainty. In situations of fundamental uncertainty, scenarios can be used as a basis for sense-making and decision-making. In (discursive) scenario planning, however, time restrictions are often neglected, and further requirements such as coherence or consistency are not made explicit. To enable the use of scenarios in complex and time-bound situations, a distributed approach to scenario construction that brings together (locally dispersed) experts with different skills and competences is presented. By combining scenario construction with techniques from decision analysis, the choice of robust alternatives, which perform sufficiently well for all eventualities, can be supported. Ideally, scenarios facilitate the exploration of the complete space of possible developments. Human experts or decision-makers can, however, only handle a small set of scenarios at a time. As the space of possible futures can hardly be modelled, nor explored adequately using standard methods, alternative approaches to identify and use the scenarios most relevant to the decision at hand are discussed.
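As a hypothetical illustration of the "robust alternatives" idea described above (not code from the talk), a maximin rule picks the alternative whose worst-case payoff across all scenarios is best; alternative names and payoff values are invented:

```python
# Sketch: choosing a robust alternative across scenarios (maximin rule).
# All alternative names, scenario names, and payoffs are illustrative only.

def robust_choice(payoffs):
    """Pick the alternative whose worst-case payoff over all scenarios is highest."""
    return max(payoffs, key=lambda alt: min(payoffs[alt].values()))

payoffs = {
    "stockpile":  {"flood": 6, "drought": 5, "no_event": 4},
    "do_nothing": {"flood": 1, "drought": 2, "no_event": 9},
}
# "stockpile" wins: its worst case (4) beats "do_nothing"'s worst case (1),
# even though "do_nothing" has the single best outcome.
print(robust_choice(payoffs))
```

A robust choice deliberately gives up some best-case performance in exchange for acceptable performance under every scenario considered.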
May 10, 2012
Carol Saunders, University of Central Florida
Information Systems Backsourcing: The Role of Network Governance
This presentation demonstrates how theory evolved from iterations of a case study on backsourcing, or taking back in-house assets, activities, and skills that are part of IS operations and were previously outsourced to one or more outside providers. The case study extends extant research on sourcing by examining outsourcing and backsourcing decision making for outsourcing clients that are part of a network of three or more organizations linked to achieve certain goals and governed by a network broker. Two types of networks are examined: lead organization networks that are based on common ownership and network administrative organization networks that are either mandated or set up by members themselves to coordinate or sustain their common activities. An exploratory multiple-case study methodology is used to investigate the role of network governance in the original outsourcing and subsequent backsourcing decisions. The four cases illustrate differences in sourcing decision making in organizations that are part of owned versus mandated networks of organizations, the role of the network brokers in backsourcing decision making, and the importance of power in owned and mandated networks. Several propositions on the role of network brokers, client executives and network competencies in sourcing decisions are presented. The presentation concludes with a discussion of how theory was derived over the life of the case study.
Carol Saunders is currently Professor of Management at the University of Central Florida. She is a LEO Award winner for lifetime accomplishments in the IS discipline and an Association for Information Systems Fellow. She has served on a number of editorial boards, including a three-year term as Editor-in-Chief of MIS Quarterly. She also served as General Conference Chair of the premier Information Systems (IS) conference, ICIS, as well as of Telecommuting '96. She helped found the Organizational Communication and Information Systems division of the Academy of Management and served in a variety of positions, including program chair and division chair. She is spending her Spring 2012 sabbatical at the Universität Erlangen-Nürnberg, where she was invited to work on her research on overload. She recently returned from Austria as the Distinguished Fulbright Scholar at the Wirtschaftsuniversität Wien (WU), and earlier held a Professional Fulbright with the Malaysian Agricultural Research and Development Institute. She has held research chairs in New Zealand, Singapore, and the Netherlands. Her current research interests include overload, backsourcing, virtual teams, virtual worlds, and time. She has published in top-ranked Management, IS, Computer Science, and Communication journals. She is currently on the editorial board of Organization Science and on the advisory board of Business and Information Systems Engineering.
March 20, 2012
Miklos A. Vasarhelyi, Rutgers University
Is the thief already out of the barn? Predictive and retroactive audit
As a traditional discipline, auditing has been retrospectively oriented. Typically, auditors review past corporate reports and vouch for their fair representation. Although this decreases agency moral hazard to a certain degree, attestation provided substantially after the event is of limited value. This presentation discusses bringing auditing closer to events: reporting a short time after the event (evergreen opinions) and performing "predictive audits" that either examine the validity of transactions before they are executed or compare actuals to highly timely normative models. These normative models are illustrated by an example that uses machine learning techniques to perform the predictions.
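To make the "predictive audit" idea concrete, here is a deliberately simplified sketch (not the talk's actual method): a per-vendor mean/standard-deviation model stands in for the machine-learning-based normative models, and incoming transactions that deviate too far are held for review before execution. Vendor names, amounts, and the threshold are hypothetical:

```python
# Sketch of a predictive-audit filter: score a transaction against a normative
# model fit on historical data and hold outliers for review before execution.
from statistics import mean, stdev

def fit_normative_model(history):
    """Per-vendor (mean, std) of past transaction amounts."""
    return {vendor: (mean(amts), stdev(amts)) for vendor, amts in history.items()}

def flag(model, vendor, amount, z_cut=3.0):
    """True if the amount deviates more than z_cut standard deviations."""
    mu, sigma = model[vendor]
    return abs(amount - mu) > z_cut * sigma  # True -> hold for auditor review

history = {"acme": [100, 110, 95, 105, 90, 102]}
model = fit_normative_model(history)
print(flag(model, "acme", 104))   # in line with past behaviour, let through
print(flag(model, "acme", 5000))  # anomalous, hold before execution
```

The point is the timing: the check runs before a transaction executes, rather than months later in a retrospective review.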
March 12, 2012
James R. Taylor, University of Montreal
Innovation and the Authoring of the Large Organization: Why the Problem?
We are unaccustomed to thinking of it in this way, but the authorship of organization is not a privilege of some special group, such as a policy task force or a strategy committee. Everybody is authoring their own organization, all the time; if they were not, there would be no organization. The authoring occurs at three levels. First, every individual understands his or her experience with others by formulating it as a personal narrative, or account. Second, as those same people collectively engage with each other to construct a shared world of practice, anchored in materiality, they interactively make sense not only of what is happening but come to identify their own place in the community, as well as its powers and its limitations. Third, until these disparate, typically geographically dispersed communities manage to organize themselves as a collectivity, they possess no common organizational identity. But because this more extended establishment of organization is loosely coupled, it can quickly develop settled patterns of interaction and distribution of privileges of authorship, typically inscribed in texts that are supposed to embody the established authority. The problem here, however, is that the resultant hierarchy reflects only where the organization has been, not always where it is now. When the organization ventures into new domains of practice, typically as a consequence of technical innovation, it is the local communities who learn first, and it is they who develop new kinds of expertise and a different understanding of the organizational order. What happens to an organization when local expertise comes up against the established régimes of authority, when, in other words, expertise and corporate position contradict each other? That is our theme.
James R. Taylor, Ph.D., F.I.C.A., is the author, co-author, or editor of eight books, including Une organisation n'est qu'un tissu de communications: essais théoriques (1988), Rethinking the Theory of Organizational Communication: How to Read an Organization (1993), and, in collaboration, with M. Chevalier, The Dynamic of Adaptation in the Federal Administration (1970), and with E. Van Every, The Vulnerable Fortress: Bureaucratic Organization and Management in the Information Age (1993), The Emergent Organization: Communication as Its Site and Surface (2000), and The Situated Organization: Case Studies in the Pragmatics of Communication Research (2011). Other edited books include The Computerization of Work: A Communication Perspective (Taylor, Groleau, Heaton & Van Every, 2001) and Empirical Explorations into the Dynamic of Text and Conversation (Cooren, Taylor & Van Every, eds., 2006). He is the author of more than ninety scientific articles, published in four languages: English, French, Portuguese, and Spanish. He has received several "best paper" awards at ICA and NCA. He is a Fellow of the International Communication Association, an Emeritus Professor and "Pioneer" of the Université de Montréal (as founder of its Department of Communication), and has been voted outstanding member of the Organizational Communication Division of the International Communication Association.
March 9, 2012
Ronald Spanjers, Director of Finance and Information at Medisch Spectrum Twente
Be Patient: A Longitudinal Study on Adoption and Diffusion of Information Technology Innovation in Dutch Healthcare
In this presentation we introduce the complex field of healthcare and the adoption and diffusion of information technology innovation (e-health). We present the results of a longitudinal case study and market analysis of an information technology innovation in Dutch healthcare. A further investigation, based on a round of interviews with a focus group of information technology managers in Dutch hospitals, with emphasis on elements of the information technology innovation decision-making process, provides insight into how information technology alignment could be improved.
Patricio Alencar da Silva, Tilburg University
Managing Value Webs
A value constellation is a business ecosystem enclosing a set of actors exchanging objects of economic value towards a common goal of profit generation. Its sustainability is critically important for medium- and small-sized participant enterprises, as it increases their chances of surviving in competitive markets. The sustainability of a value constellation, though, depends not only on the networkability of its individual enterprises, but also on the manageability of its constituent transactions. How to manage a value constellation from a transactional point of view is a problem that demands careful consideration of the design of the ontology and architecture of the modern enterprise. Starting from the ontological issues, we propose a framework comprising an ontology and a method for strategically managing value constellations. The core logic consists of modeling management as a communication behavior, which manifests from the micro-context of single business transactions to the macro-context of entire value constellations providing managing services to one another. The approach has been evaluated on a business case in electricity markets, where it is demonstrated how a smart metering constellation can supply an electricity balancing one with monitoring and adaptation services. The feasibility analysis guides a business analyst in deriving multiple strategies to manage a value constellation and selecting the (potentially) least-payoff one.
Keywords: Enterprise Ontology, Electricity Markets, Requirements Engineering, Strategic Management, Value Constellations.
February 16, 2012
Jon Pluyter, Tilburg University
“Don’t shoot the messenger!” IT-related overload in the operating room
The surgical team is bombarded with Information Technology (IT) bringing large amounts of information into the Operating Room (OR). The pace of introducing new IT is increasing rapidly as new treatments require new IT. IT-related overload is reported in the literature and by medical staff in a set of interviews. The potential costs of failure in the OR are high, including both cognitive (impaired surgical performance, medical errors) and emotional (stress, mental burden) consequences of IT-related overload. Such factors may also influence the post-adoptive behavior of the surgical team, such as user satisfaction. The general lack of theoretical understanding of IT-related overload also inhibits management from taking appropriate interventions, including investment in IT-based training curricula and in medical IT with low overload potential. The primary goal of this research is to illuminate the relationships between training and IT-related overload in order to improve safety in the OR. The research will develop a scientific understanding of the potential adverse effects of IT on cognitive load at the individual and dyadic level. Particular focus will be given to processes of attention and absorption with IT. The secondary goal of the research is to elaborate on the validation of the use of virtual reality simulation technology to train surgeons under the cognitive constraints imposed by IT in the OR. The theoretical goal is to develop and validate self-reported psychometric scales useful for understanding overload with IT in the OR, using a multi-disciplinary approach. Such understanding is aimed at improving the management of medical IT and the design of surgical education and training. Findings can also contribute to the theory-driven design of medical IT that takes into account the cognitive load IT puts on the medical staff.
Khoa Nguyen, Tilburg University
Blueprinting Approach in Support of Cloud Computing
Developing and deploying Service-Based Applications (SBAs) using different Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS), and Infrastructure-as-a-Service (IaaS) offerings from multiple providers in the cloud is still not possible, since current cloud service offerings are often provided as monolithic, one-size-fits-all solutions and give little or no room for customization. This limits the ability of SBA developers to configure and syndicate offerings from multiple SaaS, PaaS, and IaaS providers to address their application requirements. Furthermore, combining different independent cloud services necessitates a uniform description format that facilitates their design, customization, and composition. Cloud Blueprinting is a novel approach that allows SBA developers to easily design, configure, and deploy virtual SBA payloads on virtual machines and resource pools in the cloud. We propose the Blueprint concept as a uniform abstract description for cloud service offerings that may cross different cloud computing layers, i.e., SaaS, PaaS, and IaaS. To support developers with SBA design and development in the cloud, a formal Blueprint Template has been provided for unambiguously describing a blueprint, and a Blueprint Manipulation Language has been developed for the manipulation, composition, and deployment of different blueprints for an SBA. Finally, the empirical evaluation of the blueprinting approach within the EC's FP7 4CaaSt project is reported and an associated blueprint prototype implementation is presented.
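To illustrate the general shape of the Blueprint idea, here is a minimal sketch of a blueprint as a uniform description with offered and required capabilities, plus a composition step that wires one blueprint's requirements to another's offerings. The field names and matching rule are invented for illustration and are not the formal Blueprint Template of the 4CaaSt project:

```python
# Hypothetical sketch of blueprints as uniform service descriptions and of
# cross-layer composition (SaaS on PaaS on IaaS). Not the 4CaaSt template.
from dataclasses import dataclass, field

@dataclass
class Blueprint:
    name: str
    layer: str                                  # "SaaS", "PaaS", or "IaaS"
    offers: set = field(default_factory=set)    # capabilities provided
    requires: set = field(default_factory=set)  # capabilities still needed

def compose(app, platform):
    """Resolve app's requirements against platform's offerings."""
    unmet = app.requires - platform.offers
    if unmet:
        raise ValueError(f"unresolved requirements: {unmet}")
    # The composite offers what the app offers and inherits the
    # platform's own (lower-layer) requirements.
    return Blueprint(f"{app.name}+{platform.name}", app.layer,
                     offers=set(app.offers), requires=set(platform.requires))

crm = Blueprint("crm-app", "SaaS", offers={"crm-api"}, requires={"java-runtime"})
paas = Blueprint("jvm-paas", "PaaS", offers={"java-runtime"}, requires={"vm"})
print(compose(crm, paas).requires)  # the composite still needs an IaaS-level VM
```

The unresolved-requirements check is what a uniform description format buys: offerings from different providers can be matched mechanically rather than by reading prose documentation.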
February 9, 2012
Yunwei Zhao, Tsinghua University
Intelligence Decision Architecture for the Social-Cyber-Physical World
Analytics has generally been accepted as a key enabler of significant improvements in business decision-making in the cyber world and in service provisioning. With the rapidly growing deployment of social networks and the Web/Internet of Things (WoT/IoT), behavioral data may be continuously monitored and captured from both the social and the physical world. This can be used in real-time intelligent decision-making for business activities and service operations. In this paper, we propose a system architecture framework, called I-DeAr (Intelligence Decision Architecture), that allows merging the social world and the physical world in the cyber space, while supporting real-time decision-making based on the streaming behavioral data from both worlds. Details of the architecture design will be introduced, followed by the fundamental challenges of intelligent decision-making in such a social-cyber-physical world. Finally, a case study will be presented to illustrate the feasibility and practicability of our proposal.
Yan Wang, Tilburg University
Towards a Simulation-Based Modeling Framework for Service Networks
In the era of the service economy, the dominant logic of marketing has shifted from tangible resources, embedded value, and transactions to intangible resources, the co-creation of value, and relationships. The face of electronic services has transformed from object orientation to service orientation, with networked enterprises transacting and co-creating value on digital infrastructures with a global reach. These networked enterprises demand innovative service systems and networks to advance their business and compete in increasingly complex and dynamic markets and operating environments. The starting point for successfully developing a service network is to have a comprehensive picture of the process in which all the required services are delivered and all the stakeholders are involved, and, in addition, to find a novel way to explain this picture to both business and technical experts. The demands on service networks place an equal emphasis on the business domain and the technical domain. The business process design should fully reflect the business goals and the interactions among the multiple stakeholders involved in the entire business chain. It should also be translated into the technical design without any misunderstanding, so that the necessary technical expertise and resources can be dedicated to the service network and used in the most effective and cost-efficient way. It is thus crucial to have smooth communication between business modelers and technical modelers, and an alignment between business design and Information Technology (IT), through which the maximal value from IT will be added to the business. The purpose of this research is to provide a framework that offers an effective solution to service network development and performance optimization. The service networks will be represented and analyzed from multiple perspectives, in favor of both top management and IT developers.
System dynamics is adopted as the modelling approach to predict, analyze, and visualize the impact of changes in service networks over time, and to trace back to the root cause of performance anomalies and errors. Performance measurement plays an important role in this research. In the anticipated framework, the service capability and characteristics are clearly specified, and the factors influencing service performance, such as the QoS of software services and the KPIs of business services, are distinctly derived. Thus the service network performance is accurately characterized and consistently communicated across the entire network. The research will advocate creating an explicit service network structure and mechanisms for measuring and analyzing service network performance. Eventually, the research results will bridge the gap between the business and IT parties in practical service networks, and the gaps among different scientific disciplines in service science.
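As a toy illustration of the system dynamics approach mentioned above (not a model from this research), a single "open requests" stock with a demand inflow and a capacity-limited service outflow can be stepped forward with Euler integration; all parameter values are invented:

```python
# Minimal system-dynamics run: one stock (backlog of open requests),
# one inflow (demand), one capacity-limited outflow (service rate).
def simulate(demand=10.0, capacity=8.0, steps=20, dt=1.0):
    backlog, trace = 0.0, []
    for _ in range(steps):
        outflow = min(capacity, backlog / dt)  # cannot serve more than is queued
        backlog += (demand - outflow) * dt     # Euler step for the stock
        trace.append(backlog)
    return trace

trace = simulate()
print(trace[-1])  # the backlog grows steadily when demand exceeds capacity
```

Even this tiny model exhibits the kind of over-time behaviour (a steadily accumulating backlog) that the framework aims to predict and trace back to its root cause, here the demand/capacity imbalance.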
January 12, 2012
Amal Elgammal, Tilburg University
Towards a Comprehensive Framework for Business Process Compliance
Today's enterprises demand a high degree of compliance of business processes to meet regulations such as Sarbanes-Oxley and Basel II. Compliance should be enforced during all phases of the business process lifecycle, from analysis and design to deployment, monitoring, and evaluation. This course of research concentrates primarily on design-time aspects of compliance management and secondarily on business process runtime monitoring, hence providing lifetime compliance support. While current manual or ad-hoc solutions provide limited assurance that business processes adhere to relevant constraints, there is still no established generic framework for managing these constraints, integrating their relationships, maintaining their traceability to sources and processes, and automatically verifying their satisfiability. As one of the key steps towards a comprehensive compliance management framework, we propose a compliance conceptual model for refining, specifying, and managing compliance specifics and integrating them with business processes. Furthermore, we propose the Compliance Request Language (CRL), grounded in temporal logic and compliance patterns, for the formal specification of compliance requirements. This enables automated design-time compliance verification and analysis. Moreover, CRL supports the specification of non-monotonic requirements, which enables the relaxation of some compliance requirements to handle exceptional situations. As a complementary step, we also propose a root-cause analysis approach to reason about and analyze detected design-time compliance violations, which can aid the user in resolving compliance deviations. Finally, based on CRL and XPath, runtime monitoring of the running business process instances is conducted to detect violations during business process execution.
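To give a flavour of pattern-based temporal-logic compliance checking, the sketch below evaluates the common "response" pattern (whenever activity P occurs, activity Q must eventually follow) over a finite execution trace. This is a simplified stand-in written for illustration only; it is not CRL, and the activity names are invented.

```python
# Illustrative finite-trace check of a "Response" compliance pattern
# (whenever p occurs, q must occur later in the trace). A simplified
# stand-in for pattern-based temporal-logic languages, not CRL itself.

def satisfies_response(trace, p, q):
    """True iff every occurrence of activity p is eventually followed by q."""
    pending = False
    for activity in trace:
        if activity == p:
            pending = True       # an obligation to see q later is now open
        elif activity == q:
            pending = False      # the open obligation is discharged
    return not pending

trace_ok = ["receive_order", "approve_payment", "ship", "log_audit"]
trace_bad = ["receive_order", "approve_payment", "ship"]

print(satisfies_response(trace_ok, "approve_payment", "log_audit"))   # True
print(satisfies_response(trace_bad, "approve_payment", "log_audit"))  # False
```

Design-time verification differs from this runtime-style check in that it explores all possible executions of the process model rather than a single observed trace, but the pattern being checked is the same.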
Maiara Heil Cancian, Federal University of Santa Catarina
A Capability/Maturity Model for Process Improvement to Collaborative SaaS
SaaS (Software-as-a-Service) brings a set of advantages that have attracted providers and clients due to the facilities provided by cloud computing. Its adoption, however, requires many changes in the organizations involved and raises several issues to be dealt with. One of these issues is the reliability and trustworthiness of SaaS-based services. An essential problem is how SaaS clients, and service software providers that want to work with each other, can trust the general quality of services when those services are provided by many, not previously known, providers. A number of approaches can be used to address this. This PhD thesis intends to exploit the certification of providers within such a ubiquitous scenario, assuming (as its hypothesis) that the trustworthiness of services can be increased if clients (and sometimes other providers) can verify and assure the way, that is, the processes by which, services are developed.
January 12, 2012
Faiza Bukhsh, Tilburg University
Extended Single Window: Service Oriented Auditing
In international trade, reliability, security, and cost-effective logistic chain management are important challenges that can only be met by innovative uses of IT. The Extended Single Window project aims at a drastic reduction of physical inspections of goods in main ports through coordinated planning by government authorities, reliable transport to and from hinterland hubs, and reduction of administrative costs. Some savings can be achieved by improving the distribution of data, but more is possible by reengineering the inspection processes from an audit perspective. The main question then is: "who is auditing what, and where?"
Jeewanie Jayasinghe Arachchige, Tilburg University
Unified Modeling Framework for Service Design
Service-oriented architectures are becoming the business standard for realizing enterprise information systems, creating a need for analysis and design methods that are truly service-oriented. Most research on this topic so far takes a strict software engineering perspective. For a proper alignment between business and IT, a service perspective at the business level is needed as well. The main focus of this research is the alignment of business services with software services. This alignment is to be achieved by a framework that starts service design by identifying services using a new modeling language, BSRM (business service and resource modeling language). BSRM follows the MDA approach and is based on an established business ontology, REA. The framework encourages system designers and developers not to start their designs from scratch but from service-oriented business models and patterns that have already been developed. The research outcome would benefit both technical and business users.
January 11, 2012
Guido Governatori, The NICTA Queensland Research Laboratory
A Logic Based Approach to Business Process Compliance
In this presentation we first introduce the notion of business process compliance as the alignment of the (formal) specifications of business processes with the (formal) specifications describing the regulatory framework governing the field of the processes. In the second part we discuss some options for modeling the regulatory components, and we propose a particular logic designed to represent norms and policies. Finally, we show how to use the logic to check whether a process is compliant.
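One feature that distinguishes normative reasoning from plain temporal properties is that a violated obligation may be compensated by a further obligation rather than making the process outright non-compliant. The toy sketch below conveys that idea only; it is not the defeasible deontic logic presented in the talk, and the norm and activity names are invented.

```python
# Toy sketch of obligation-with-compensation checking, loosely inspired by
# logic-based compliance approaches (NOT the actual formalism of the talk).
# A norm is (trigger, obligation, compensation): once the trigger occurs,
# the obligation must be fulfilled afterwards; if it is not, fulfilling the
# compensation instead still renders the process compliant.

def check(trace, trigger, obligation, compensation):
    """Classify a finished process trace against a single compensable norm."""
    if trigger not in trace:
        return "not applicable"
    after = trace[trace.index(trigger) + 1:]
    if obligation in after:
        return "compliant"
    if compensation in after:
        return "compliant (compensated)"
    return "non-compliant"

# Hypothetical norm: after shipping goods, an invoice must be sent;
# failing that, issuing a credit note compensates the violation.
norm = ("ship_goods", "send_invoice", "issue_credit_note")
print(check(["order", "ship_goods", "send_invoice"], *norm))       # compliant
print(check(["order", "ship_goods", "issue_credit_note"], *norm))  # compliant (compensated)
print(check(["order", "ship_goods", "close"], *norm))              # non-compliant
```

A real norm-representation logic would also handle defeasibility (rules overriding one another) and check the process model itself rather than individual traces, which is what the proposed approach addresses.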