  • Publication
    FinDEx: A Synthetic Data Sharing Platform for Financial Fraud Detection
    (2024-01-06)
    Fabian Sven Karst
    Financial fraud inflicted more than 800 billion USD in damages on the global economy in the last year alone. Although financial institutions possess advanced AI systems for fraud detection, the time required to accumulate a sufficient volume of fraudulent data for training models creates a costly vulnerability. Combined with data and privacy regulations that prevent institutions from sharing fraud detection training data, this poses a major challenge. To address it, we propose the concept of a synthetic data-sharing ecosystem platform (FinDEx). The platform preserves data anonymity by generating synthetic training data from each institution's fraud detection datasets. Various synthetic data generation techniques are employed to rapidly construct a shared dataset for all ecosystem members. Using design science research, this paper draws on the financial fraud detection literature, data sharing practices, and modular systems theory to derive design knowledge for the platform architecture. Furthermore, the feasibility of different data generation algorithms, such as generative adversarial networks, variational autoencoders, and Gaussian mixture models, was evaluated, and different methods for integrating synthetic data into the training procedure were tested. The paper thus contributes to theory at the intersection of fraud detection and data sharing and provides practitioners with guidelines for designing such systems.
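The abstract above evaluates several generative approaches for producing shareable synthetic training data. As a minimal sketch (not the FinDEx implementation, and with illustrative features and parameters), a Gaussian mixture model can be fit to real transaction records and then sampled to yield anonymized synthetic rows:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Illustrative stand-in for real transaction features
# (amount, hour of day, upstream risk score).
rng = np.random.default_rng(0)
real = np.column_stack([
    rng.lognormal(3.0, 1.0, 500),          # transaction amount
    rng.integers(0, 24, 500).astype(float),  # hour of day
    rng.beta(2, 5, 500),                   # risk score in [0, 1]
])

# Fit a mixture density to the real data, then sample synthetic
# rows that follow the learned distribution rather than copying
# any individual record.
gmm = GaussianMixture(n_components=4, random_state=0).fit(real)
synthetic, _ = gmm.sample(1000)
print(synthetic.shape)  # (1000, 3)
```

Institutions would share only the sampled rows (or the fitted mixture parameters), never the underlying records.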
  • Publication
    Combining Humans and Machine Learning: A Novel Approach for Evaluating Crowdsourcing Contributions in Idea Contests
    (2018)
    Dellermann, Dominik
    ;
    Lipusch, Nikolaus
    The creative potential of innovative contributions from the crowd poses critical challenges. The quantity of contributions and the resources required to identify valuable ideas are high and remain challenging for firms that run open innovation initiatives. To solve these problems, research on algorithmic approaches has proven valuable by identifying metrics that distinguish high-quality from low-quality ideas. However, such filtering approaches always risk discarding promising ideas by classifying good ideas as bad ones. In response, organizations have turned to the crowd not just to generate ideas but also to evaluate them and filter high-quality contributions. However, such crowd-based filtering approaches tend to perform poorly in practice because they make unrealistic demands on the crowd. We therefore conduct a design science research project to provide prescriptive knowledge on how to combine machine learning techniques with crowd evaluation to adaptively assign humans to ideas.
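The hybrid approach described above, combining machine evaluation with adaptive assignment of human evaluators, can be sketched as a simple routing policy. This is an illustrative policy, not the paper's actual assignment mechanism: a classifier scores each idea, confident scores are decided automatically, and ambiguous ones are routed to crowd evaluators.

```python
import numpy as np

def route_ideas(probs, band=0.2):
    """Route uncertain ideas to crowd evaluators.

    probs: predicted probability that each idea is high quality.
    Ideas whose score falls within `band` of 0.5 are ambiguous
    and go to humans; the rest are accepted or rejected
    automatically. (Illustrative policy only.)
    """
    probs = np.asarray(probs)
    to_crowd = np.abs(probs - 0.5) < band
    accept = probs >= 0.5 + band
    return accept, to_crowd

accept, to_crowd = route_ideas([0.95, 0.55, 0.42, 0.10])
print(accept.tolist())    # [True, False, False, False]
print(to_crowd.tolist())  # [False, True, True, False]
```

Widening `band` sends more ideas to humans, trading evaluation cost for fewer promising ideas lost to misclassification.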