Publications

This is a complete list of publications I have authored or co-authored. My h-index and i10-index are available on Google Scholar. Clicking the text of an entry reveals its BibTeX citation, together with a download link when one is available.
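
For convenience, here is a minimal sketch of how one of these entries can be cited in a standard LaTeX + BibTeX workflow. It assumes you copy the BibTeX record shown on this page into a bibliography file; the filename anjos.bib is only an example, while the key interspeech-2017 is taken from the first entry below.

    \documentclass{article}
    \begin{document}
    % Cite the Interspeech 2017 entry by its BibTeX key
    Bob's Kaldi integration is described in~\cite{interspeech-2017}.
    \bibliographystyle{plain}
    % anjos.bib would contain the entries copied from this page
    \bibliography{anjos}
    \end{document}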

2017

Milos Cernak, Alain Komaty, Amir Mohammadi, André Anjos, and Sébastien Marcel. Bob speaks Kaldi. In Proceedings of Interspeech. August 2017. URL: https://publications.idiap.ch/index.php/publications/show/3623.
@inproceedings{interspeech-2017,
    author = "Cernak, Milos and Komaty, Alain and Mohammadi, Amir and Anjos, André and Marcel, Sébastien",
    month = "August",
    title = "Bob Speaks Kaldi",
    booktitle = "Proceedings of Interspeech",
    year = "2017",
    url = "https://publications.idiap.ch/index.php/publications/show/3623",
    pdf = "https://www.idiap.ch/~aanjos/papers/interspeech-2017.pdf",
    abstract = "This paper introduces and demonstrates Kaldi integration into Bob signal-processing and machine learning toolbox. The motivation for this integration is two-fold. Firstly, Bob benefits from using advanced speech processing tools developed in Kaldi. Secondly, Kaldi benefits from using complementary Bob modules, such as modulation-based VAD with an adaptive thresholding. In addition, Bob is designed as an open science tool, and this integration might offer to the Kaldi speech community a framework for better reproducibility of state-of-the-art research results."
}
André Anjos, Manuel Günther, Tiago de Freitas Pereira, Pavel Korshunov, Amir Mohammadi, and Sébastien Marcel. Continuously reproducing toolchains in pattern recognition and machine learning experiments. In Thirty-fourth International Conference on Machine Learning. August 2017. URL: https://publications.idiap.ch/index.php/publications/show/3666.
@inproceedings{icml-2017-2,
    author = "Anjos, André and Günther, Manuel and de Freitas Pereira, Tiago and Korshunov, Pavel and Mohammadi, Amir and Marcel, Sébastien",
    month = "August",
    title = "Continuously Reproducing Toolchains in Pattern Recognition and Machine Learning Experiments",
    booktitle = "Thirty-fourth International Conference on Machine Learning",
    year = "2017",
    location = "Sidney, Australia",
    url = "https://publications.idiap.ch/index.php/publications/show/3666",
    pdf = "https://www.idiap.ch/~aanjos/papers/icml-2017-2.pdf",
    poster = "https://www.idiap.ch/~aanjos/posters/icml-2017-2.pdf",
    abstract = "Pattern recognition and machine learning research work often contains experimental results on real-world data, which corroborates hypotheses and provides a canvas for the development and comparison of new ideas. Results, in this context, are typically summarized as a set of tables and figures, allowing the comparison of various methods, highlighting the advantages of the proposed ideas. Unfortunately, result reproducibility is often an overlooked feature of original research publications, competitions, or benchmark evaluations. The main reason for such a gap is the complexity on the development of software associated with these reports. Software frameworks are difficult to install, maintain, and distribute, while scientific experiments often consist of many steps and parameters that are difficult to report. The increasingly rising complexity of research challenges make it even more difficult to reproduce experiments and results. In this paper, we emphasize that a reproducible research work should be repeatable, shareable, extensible, and stable, and discuss important lessons we learned in creating, distributing, and maintaining software and data for reproducible research in pattern recognition and machine learning. We focus on a specific use-case of face recognition and describe in details how we can make the recognition experiments reproducible in practice."
}
André Anjos, Laurent El Shafey, and Sébastien Marcel. BEAT: an open-science web platform. In Thirty-fourth International Conference on Machine Learning. August 2017. URL: https://publications.idiap.ch/index.php/publications/show/3665.
@inproceedings{icml-2017-1,
    author = "Anjos, André and El Shafey, Laurent and Marcel, Sébastien",
    month = "August",
    title = "BEAT: An Open-Science Web Platform",
    booktitle = "Thirty-fourth International Conference on Machine Learning",
    year = "2017",
    location = "Sydney, Australia",
    url = "https://publications.idiap.ch/index.php/publications/show/3665",
    pdf = "https://www.idiap.ch/~aanjos/papers/icml-2017-1.pdf",
    poster = "https://www.idiap.ch/~aanjos/posters/icml-2017-1.pdf",
    abstract = "With the increased interest in computational sciences, machine learning (ML), pattern recognition (PR) and big data, governmental agencies, academia and manufacturers are overwhelmed by the constant influx of new algorithms and techniques promising improved performance, generalization and robustness. Sadly, result reproducibility is often an overlooked feature accompanying original research publications, competitions and benchmark evaluations. The main reasons behind such a gap arise from natural complications in research and development in this area: the distribution of data may be a sensitive issue; software frameworks are difficult to install and maintain; Test protocols may involve a potentially large set of intricate steps which are difficult to handle. To bridge this gap, we built an open platform for research in computational sciences related to pattern recognition and machine learning, to help on the development, reproducibility and certification of results obtained in the field. By making use of such a system, academic, governmental or industrial organizations enable users to easily and socially develop processing toolchains, re-use data, algorithms, workflows and compare results from distinct algorithms and/or parameterizations with minimal effort. This article presents such a platform and discusses some of its key features, uses and limitations. We overview a currently operational prototype and provide design insights."
}
André Anjos, Laurent El-Shafey, and Sébastien Marcel. BEAT: an open-source web-based open-science platform. April 2017. URL: https://arxiv.org/abs/1704.02319, arXiv:1704.02319.
@misc{arxiv-2017,
    author = "Anjos, André and El-Shafey, Laurent and Marcel, Sébastien",
    title = "BEAT: An Open-Source Web-Based Open-Science Platform",
    year = "2017",
    month = "April",
    archivePrefix = "arXiv",
    eprint = "1704.02319",
    primaryClass = "cs.SE",
    url = "https://arxiv.org/abs/1704.02319",
    pdf = "https://www.idiap.ch/~aanjos/papers/arxiv-2017.pdf",
    abstract = "With the increased interest in computational sciences, machine learning (ML), pattern recognition (PR) and big data, governmental agencies, academia and manufacturers are overwhelmed by the constant influx of new algorithms and techniques promising improved performance, generalization and robustness. Sadly, result reproducibility is often an overlooked feature accompanying original research publications, competitions and benchmark evaluations. The main reasons behind such a gap arise from natural complications in research and development in this area: the distribution of data may be a sensitive issue; software frameworks are difficult to install and maintain; Test protocols may involve a potentially large set of intricate steps which are difficult to handle. Given the raising complexity of research challenges and the constant increase in data volume, the conditions for achieving reproducible research in the domain are also increasingly difficult to meet. To bridge this gap, we built an open platform for research in computational sciences related to pattern recognition and machine learning, to help on the development, reproducibility and certification of results obtained in the field. By making use of such a system, academic, governmental or industrial organizations enable users to easily and socially develop processing toolchains, re-use data, algorithms, workflows and compare results from distinct algorithms and/or parameterizations with minimal effort. This article presents such a platform and discusses some of its key features, uses and limitations. We overview a currently operational prototype and provide design insights."
}

2016

Aythami Morales, Julian Fierrez, Ruben Tolosana, Javier Ortega-Garcia, Javier Galbally, Marta Gomez-Barrero, André Anjos, and Sébastien Marcel. Keystroke biometrics ongoing competition. IEEE Access, 4:7736–7746, November 2016. doi:10.1109/ACCESS.2016.2626718.
@article{ieee-access-2016,
    author = "Morales, Aythami and Fierrez, Julian and Tolosana, Ruben and Ortega-Garcia, Javier and Galbally, Javier and Gomez-Barrero, Marta and Anjos, Andr{\'{e}} and Marcel, S{\'{e}}bastien",
    month = "November",
    title = "Keystroke Biometrics Ongoing Competition",
    journal = "IEEE Access",
    volume = "4",
    year = "2016",
    pages = "7736--7746",
    issn = "2169-3536",
    doi = "10.1109/ACCESS.2016.2626718",
    pdf = "https://www.idiap.ch/~aanjos/papers/ieee-access-2016.pdf",
    abstract = "This paper presents the first Keystroke Biometrics Ongoing Competition (KBOC) organized to establish a reproducible baseline in person authentication using keystroke biometrics. The competition has been developed using the BEAT platform and includes one of the largest keystroke databases publicly available based on a fixed text scenario. The database includes genuine and attacker keystroke sequences from 300 users acquired in 4 different sessions distributed in a four month time span. The sequences correspond to the user's name and surname and therefore each user comprises an individual and personal sequence. As baseline for KBOC we report the results of 31 different algorithms evaluated according to performance and robustness. The systems have achieved EERs as low as 5.32\\% and high robustness against multisession variability with drop of performances lower than 1\\% for probes separated by months. The entire database is publicly available at the competition website."
}
Ivana Chingovska, Nesli Erdogmus, André Anjos, and Sébastien Marcel. Face recognition systems under spoofing attacks. In Face Recognition Systems Under Spoofing Attacks, chapter 8, pages 165–194. Springer International Publishing, 1st edition, February 2016. doi:10.1007/978-3-319-28501-6_8.
@incollection{face-spoof-2016,
    author = "Chingovska, Ivana and Erdogmus, Nesli and Anjos, Andr{\'{e}} and Marcel, S{\'{e}}bastien",
    month = "February",
    title = "Face Recognition Systems Under Spoofing Attacks",
    booktitle = "Face Recognition Systems Under Spoofing Attacks",
    edition = "1st edition",
    chapter = "8",
    year = "2016",
    pages = "165--194",
    publisher = "Springer International Publishing",
    doi = "10.1007/978-3-319-28501-6\_8",
    abstract = "In this chapter, we give an overview of spoofing attacks and spoofing countermeasures for face recognition systems , with a focus on visual spectrum systems (VIS) in 2D and 3D, as well as near-infrared (NIR) and multispectral systems . We cover the existing types of spoofing attacks and report on their success to bypass several state-of-the-art face recognition systems. The results on two different face spoofing databases in VIS and one newly developed face spoofing database in NIR show that spoofing attacks present a significant security risk for face recognition systems in any part of the spectrum. The risk is partially reduced when using multispectral systems. We also give a systematic overview of the existing anti-spoofing techniques, with an analysis of their advantages and limitations and prospective for future work."
}

2015

Ivana Chingovska and André Anjos. On the use of client identity information for face anti-spoofing. IEEE Transactions on Information Forensics and Security, Special Issue on Biometric Anti-spoofing, 10(4):787–796, February 2015. doi:10.1109/TIFS.2015.2400392.
@article{tifs-2015,
    author = "Chingovska, Ivana and Anjos, Andr{\'{e}}",
    keywords = "Biometric Verification, Counter-Measures, Counter-Spoofing, Liveness Detection, Replay, Spoofing Attacks",
    title = "On the use of client identity information for face anti-spoofing",
    journal = "IEEE Transactions on Information Forensics and Security, Special Issue on Biometric Anti-spoofing",
    volume = "10",
    number = "4",
    month = "February",
    year = "2015",
    pages = "787--796",
    doi = "10.1109/TIFS.2015.2400392",
    pdf = "https://www.idiap.ch/~aanjos/papers/tifs-2015.pdf",
    abstract = "With biometrics playing the role of a password which can not be replaced if stolen, the necessity of establishing counter-measures to biometric spoofing attacks has been recognized. Regardless of the biometric mode, the typical approach of anti-spoofing systems is to classify biometric evidence based on features discriminating between real accesses and spoofing attacks. For the first time, to the best of our knowledge, this paper studies the amount of client-specific information within these features and how it affects the performance of anti-spoofing systems. We make use of this information to build two client-specific anti-spoofing solutions, one relying on a generative and another one on a discriminative paradigm. The proposed methods, tested on a set of state-of-the-art anti-spoofing features for the face mode, outperform the client-independent approaches with up to 50\\% relative improvement and exhibit better generalization capabilities on unseen types of spoofing attacks."
}

2014

Ivana Chingovska, André Anjos, and Sébastien Marcel. Biometrics evaluation under spoofing attacks. IEEE Transactions on Information Forensics and Security, 9(12), August 2014. doi:10.1109/TIFS.2014.2349158.
@article{tifs-2014,
    author = "Chingovska, Ivana and Anjos, André and Marcel, Sébastien",
    title = "Biometrics Evaluation Under Spoofing Attacks",
    journal = "IEEE Transactions on Information, Forensics and Security",
    year = "2014",
    month = "August",
    volume = "9",
    number = "12",
    doi = "10.1109/TIFS.2014.2349158",
    pdf = "https://www.idiap.ch/~aanjos/papers/tifs-2014.pdf",
    abstract = "While more accurate and reliable than ever, the trustworthiness of biometric verification systems is compromised by the emergence of spoofing attacks. Responding to this threat, numerous research publications address isolated spoofing detection, resulting in efficient counter-measures for many biometric modes. However, an important, but often overlooked issue regards their engagement into a verification task and how to measure their impact on the verification systems themselves. A novel evaluation framework for verification systems under spoofing attacks, called Expected Performance and Spoofability (EPS) framework, is the major contribution of this paper. Its purpose is to serve for an objective comparison of different verification systems with regards to their verification performance and vulnerability to spoofing, taking into account the system’s application-dependent susceptibility to spoofing attacks and cost of the errors. The convenience of the proposed open-source framework is demonstrated for the face mode, by comparing the security guarantee of four baseline face verification systems before and after they are secured with anti-spoofing algorithms."
}
Stan Z. Li, Javier Galbally, André Anjos, and Sébastien Marcel. Evaluation databases. In Sébastien Marcel, Mark Nixon, and Stan Z. Li, editors, Handbook of Biometric Anti-Spoofing, chapter Appendix A, pages 247–278. Springer-Verlag, 2014. doi:10.1007/978-1-4471-6524-8.
@incollection{hopad-2014-3,
    author = "Z.Li, Stan and Galbally, Javier and Anjos, Andr{\'{e}} and Marcel, S{\'{e}}bastien",
    editor = "Marcel, S{\'{e}}bastien and Nixon, Mark and Z.Li, Stan",
    title = "Evaluation Databases",
    booktitle = "Handbook of Biometric Anti-Spoofing",
    chapter = "Appendix A",
    year = "2014",
    pages = "247--278",
    publisher = "Springer-Verlag",
    isbn = "978-1-4471-6523-1",
    doi = "10.1007/978-1-4471-6524-8"
}
Ivana Chingovska, André Anjos, and Sébastien Marcel. Evaluation methodologies. In Sébastien Marcel, Mark Nixon, and Stan Z. Li, editors, Handbook of Biometric Anti-Spoofing, chapter 10, pages 185–204. Springer-Verlag, 2014. doi:10.1007/978-1-4471-6524-8_10.
@incollection{hopad-2014-2,
    author = "Chingovska, Ivana and Anjos, André and Marcel, Sébastien",
    editor = "Marcel, S{\'{e}}bastien and Nixon, Mark and Z.Li, Stan",
    title = "Evaluation Methodologies",
    chapter = "10",
    pages = "185--204",
    booktitle = "Handbook of Biometric Anti-Spoofing",
    publisher = "Springer-Verlag",
    year = "2014",
    doi = "10.1007/978-1-4471-6524-8\_10",
    abstract = "Following the definition of the task of the anti-spoofing systems to discriminate between real accesses and spoofing attacks, anti-spoofing can be regarded as a binary classification problem. The spoofing databases and the evaluation methodologies for anti-spoofing systems most often comply to the standards for binary classification problems. However the anti-spoofing systems are not destined to work stand-alone, and their main purpose is to protect a verification system from spoofing attacks. In the process of combining the decision of an anti-spoofing and a recognition system, effects on the recognition performance can be expected. Therefore, it is important to analyze the problem of anti-spoofing under the umbrella of biometric recognition systems. This brings certain requirements in the database design, as well as adapted concepts for evaluation of biometric recognition systems under spoofing attacks."
}
André Anjos, Jukka Komulainen, Sébastien Marcel, Abdenour Hadid, and Matti Pietikäinen. Face anti-spoofing: visual approach. In Sébastien Marcel, Mark Nixon, and Stan Z. Li, editors, Handbook of Biometric Anti-Spoofing, chapter 4, pages 65–82. Springer-Verlag, 2014. doi:10.1007/978-1-4471-6524-8_4.
@incollection{hopad-2014,
    author = "Anjos, André and Komulainen, Jukka and Marcel, Sébastien and Hadid, Abdenour and Pietikainen, Matti",
    editor = "Marcel, S{\'{e}}bastien and Nixon, Mark and Z.Li, Stan",
    title = "Face Anti-Spoofing: Visual Approach",
    chapter = "4",
    booktitle = "Handbook of Biometric Anti-Spoofing",
    publisher = "Springer-Verlag",
    pages = "65--82",
    year = "2014",
    doi = "10.1007/978-1-4471-6524-8\_4",
    abstract = "User authentication is an important step to protect information and in this regard face biometrics is advantageous. Face biometrics is natural, easy to use and less human-invasive. Unfortunately, recent work revealed that face biometrics is quite vulnerable to spoofing attacks. This chapter presents the different modalities of attacks to visual spectrum face recognition systems. We introduce public datasets for the evaluation of vulnerability of recognition systems and performance of counter-measures. Finally, we build a comprehensive view of anti-spoofing techniques for visual spectrum face recognition and provide an outlook of issues that remain unaddressed."
}
Tiago de Freitas Pereira, Jukka Komulainen, André Anjos, José Mario De Martino, Abdenour Hadid, Matti Pietikäinen, and Sébastien Marcel. Face liveness detection using dynamic texture. EURASIP Journal on Image and Video Processing, 2014:2, January 2014. doi:10.1186/1687-5281-2014-2.
@article{eurasip-2014,
    author = "de Freitas Pereira, Tiago and Komulainen, Jukka and Anjos, André and De Martino, José Mario and Hadid, Abdenour and Pietikainen, Matti and Marcel, Sébastien",
    title = "Face liveness detection using dynamic texture",
    journal = "EURASIP Journal on Image and Video Processing",
    year = "2014",
    month = "January",
    doi = "10.1186/1687-5281-2014-2",
    volume = "2014:2",
    pdf = "https://www.idiap.ch/~aanjos/papers/eurasip-2014.pdf",
    abstract = "User authentication is an important step to protect information, and in this context, face biometrics is potentially advantageous. Face biometrics is natural, intuitive, easy to use, and less human-invasive. Unfortunately, recent work has revealed that face biometrics is vulnerable to spoofing attacks using cheap low-tech equipment. This paper introduces a novel and appealing approach to detect face spoofing using the spatiotemporal (dynamic texture) extensions of the highly popular local binary pattern operator. The key idea of the approach is to learn and detect the structure and the dynamics of the facial micro-textures that characterise real faces but not fake ones. We evaluated the approach with two publicly available databases (Replay-Attack Database and CASIA Face Anti-Spoofing Database). The results show that our approach performs better than state-of-the-art techniques following the provided evaluation protocols of each database."
}
André Anjos, Ivana Chingovska, and Sébastien Marcel. Anti-spoofing: face databases. In Stan Z. Li and Anil Jain, editors, Encyclopedia of Biometrics. Springer US, 2nd edition, 2014. doi:10.1007/978-3-642-27733-7_9212-2.
@incollection{eob-2014-2,
    author = "Anjos, Andr{\'{e}} and Chingovska, Ivana and Marcel, S{\'{e}}bastien",
    editor = "Z.Li, Stan and Jain, Anil",
    title = "Anti-Spoofing: Face Databases",
    booktitle = "Encyclopedia of Biometrics",
    edition = "2nd edition",
    year = "2014",
    publisher = "Springer US",
    isbn = "978-3-642-27733-7",
    doi = "10.1007/978-3-642-27733-7\_9212-2",
    abstract = "Datasets for the evaluation of face verification system vulnerabilities to spoofing attacks and for the evaluation of face spoofing countermeasures."
}
Ivana Chingovska, André Anjos, and Sébastien Marcel. Anti-spoofing: evaluation methodologies. In Stan Z. Li and Anil Jain, editors, Encyclopedia of Biometrics. Springer US, 2nd edition, 2014. doi:10.1007/978-3-642-27733-7.
@incollection{eob-2014,
    author = "Chingovska, Ivana and Anjos, Andr{\'{e}} and Marcel, S{\'{e}}bastien",
    editor = "Z.Li, Stan and Jain, Anil",
    title = "Anti-spoofing: Evaluation Methodologies",
    booktitle = "Encyclopedia of Biometrics",
    edition = "2nd edition",
    year = "2014",
    publisher = "Springer US",
    isbn = "978-3-642-27733-7",
    doi = "10.1007/978-3-642-27733-7",
    abstract = "Following the definition of the task of the anti-spoofing systems to discriminate between real accesses and spoofing attacks, anti-spoofing can be regarded as a binary classification problem. The spoofing databases and the evaluation methodologies for anti-spoofing systems most often comply to the standards for binary classification problems. However, the anti-spoofing systems are not destined to work stand-alone, and their main purpose is to protect a verification system from spoofing attacks. In the process of combining the decision of an anti-spoofing and a recognition system, effects on the recognition performance can be expected. Therefore, it is important to analyze the problem of anti-spoofing under the umbrella of biometric recognition systems. This brings certain requirements in the database design, as well as adapted concepts for evaluation of biometric recognition systems under spoofing attacks."
}

2013

André Anjos, Murali Mohan Chakka, and Sébastien Marcel. Motion-based counter-measures to photo attacks in face recognition. IET Biometrics, July 2013. doi:10.1049/iet-bmt.2012.0071.
@article{iet-biometrics-2013,
    author = "Anjos, André and Chakka, Murali Mohan and Marcel, Sébastien",
    title = "Motion-Based Counter-Measures to Photo Attacks in Face Recognition",
    journal = "IET Biometrics",
    year = "2013",
    month = "July",
    pdf = "https://www.idiap.ch/~aanjos/papers/iet-biometrics-2013.pdf",
    doi = "10.1049/iet-bmt.2012.0071",
    abstract = "Identity spoofing is a contender for high-security face recognition applications. With the advent of social media and globalized search, our face images and videos are wide-spread on the internet and can be potentially used to attack biometric systems without previous user consent. Yet, research to counter these threats is just on its infancy - we lack public standard databases, protocols to measure spoofing vulnerability and baseline methods to detect these attacks. The contributions of this work to the area are three-fold: firstly we introduce a publicly available PHOTO-ATTACK database with associated protocols to measure the effectiveness of counter-measures. Based on the data available, we conduct a study on current state-of-the-art spoofing detection algorithms based on motion analysis, showing they fail under the light of these new dataset. By last, we propose a new technique of counter-measure solely based on foreground/background motion correlation using Optical Flow that outperforms all other algorithms achieving nearly perfect scoring with an equal-error rate of 1.52\\% on the available test data. The source code leading to the reported results is made available for the replicability of findings in this article."
}
I. Chingovska, J. Yang, Z. Lei, D. Yi, S. Z. Li, O. Kähm, C. Glaser, N. Damer, A. Kuijper, A. Nouak, J. Komulainen, T. Pereira, S. Gupta, S. Khandelwal, S. Bansal, A. Rai, T. Krishna, D. Goyal, M.-A. Waris, H. Zhang, I. Ahmad, S. Kiranyaz, M. Gabbouj, R. Tronci, M. Pili, N. Sirena, F. Roli, J. Galbally, J. Fierrez, A. Pinto, H. Pedrini, W. S. Schwartz, A. Rocha, A. Anjos, and S. Marcel. The 2nd competition on counter measures to 2D face spoofing attacks. In International Conference on Biometrics 2013. June 2013. doi:10.1109/ICB.2013.6613026.
@inproceedings{icb-2013-3,
    author = "Chingovska, I. and Yang, J. and Lei, Z. and Yi, D. and Li, S. Z. and Kähm, O. and Glaser, C. and Damer, N. and Kuijper, A. and Nouak, A. and Komulainen, J. and Pereira, T. and Gupta, S. and Khandelwal, S. and Bansal, S. and Rai, A. and Krishna, T. and Goyal, D. and Waris, M.-A. and Zhang, H. and Ahmad, I. and Kiranyaz, S. and Gabbouj, M. and Tronci, R. and Pili, M. and Sirena, N. and Roli, F. and Galbally, J. and Fierrez, J. and Pinto, A. and Pedrini, H. and Schwartz, W. S. and Rocha, A. and Anjos, A. and Marcel, S.",
    title = "The 2nd Competition on Counter Measures to 2D Face Spoofing Attacks",
    booktitle = "International Conference on Biometrics 2013",
    month = "June",
    year = "2013",
    pdf = "https://www.idiap.ch/~aanjos/papers/icb-2013-3.pdf",
    doi = "10.1109/ICB.2013.6613026",
    abstract = "As a crucial security problem, anti-spoofing in biometrics, and particularly for the face modality, has achieved great progress in the recent years. Still, new threats arrive in form of better, more realistic and more sophisticated spoofing attacks. The objective of the 2nd Competition on Counter Measures to 2D Face Spoofing Attacks is to challenge researchers to create counter measures effectively detecting a variety of attacks. The submitted propositions are evaluated on the Replay-Attack database and the achieved results are presented in this paper."
}
Jukka Komulainen, Abdenour Hadid, Matti Pietikäinen, André Anjos, and Sébastien Marcel. Complementary countermeasures for detecting scenic face spoofing attacks. In International Conference on Biometrics 2013. June 2013. doi:10.1109/ICB.2013.6612968.
@inproceedings{icb-2013-2,
    author = "Komulainen, Jukka and Hadid, Abdenour and Pietikäinen, Matti and Anjos, André and Marcel, Sébastien",
    title = "Complementary Countermeasures for Detecting Scenic Face Spoofing Attacks",
    booktitle = "International Conference on Biometrics 2013",
    month = "June",
    year = "2013",
    pdf = "https://www.idiap.ch/~aanjos/papers/icb-2013-2.pdf",
    doi = "10.1109/ICB.2013.6612968",
    abstract = "The face recognition community has finally started paying more attention to the long-neglected problem of spoofing attacks. The number of countermeasures is gradually increasing and fairly good results have been reported on the publicly available databases. There exists no superior anti-spoofing technique due to the varying nature of attack scenarios and acquisition conditions. Therefore, it is important to find out complementary countermeasures and study how they should be combined in order to construct an easily extensible anti-spoofing framework. In this paper, we address this issue by studying fusion of motion and texture based countermeasures under several types of scenic face attacks. We provide an intuitive way to explore the fusion potential of different visual cues and show that the performance of the individual methods can be vastly improved by performing fusion at score level. The Half-Total Error Rate (HTER) of the best individual countermeasure was decreased from 11.2\\% to 5.1\\% on the Replay Attack Database. More importantly, we question the idea of using complex classification schemes in individual countermeasures, since nearly same fusion performance is obtained by replacing them with a simple linear one. In this manner, the computational efficiency and also probably the generalization ability of the resulting anti-spoofing framework are increased."
}
Tiago de Freitas Pereira, André Anjos, José Mario De Martino, and Sébastien Marcel. Can face anti-spoofing countermeasures work in a real world scenario? In International Conference on Biometrics 2013. June 2013. doi:10.1109/ICB.2013.6612981.
@inproceedings{icb-2013-1,
    author = "de Freitas Pereira, Tiago and Anjos, André and Martino, José Mario De and Marcel, Sébastien",
    title = "Can face anti-spoofing countermeasures work in a real world scenario?",
    booktitle = "International Conference on Biometrics 2013",
    month = "June",
    year = "2013",
    doi = "10.1109/ICB.2013.6612981",
    pdf = "https://www.idiap.ch/~aanjos/papers/icb-2013-1.pdf",
    abstract = "User authentication is an important step to protect in- formation and in this field face biometrics is advantageous. Face biometrics is natural, easy to use and less human-invasive. Unfortunately, recent work has revealed that face biometrics is vulnerable to spoofing attacks using low-tech equipments. This article assesses how well existing face anti-spoofing countermeasures can work in a more realistic condition. Experiments carried out with two freely available video databases (Replay Attack Database and CASIA Face Anti-Spoofing Database) show low generalization and possible database bias in the evaluated countermeasures. To generalize and deal with the diversity of attacks in a real world scenario we introduce two strategies that show promising results."
}
Ivana Chingovska, André Anjos, and Sébastien Marcel. Anti-spoofing in action: joint operation with a verification system. In Computer Vision and Pattern Recognition Conference - Biometrics Workshop. June 2013. doi:10.1109/CVPRW.2013.22.
@inproceedings{cvpr-bw-2013,
    author = "Chingovska, Ivana and Anjos, André and Marcel, Sébastien",
    title = "Anti-spoofing in action: joint operation with a verification system",
    booktitle = "Computer Vision and Pattern Recognition Conference - Biometrics Workshop",
    year = "2013",
    doi = "10.1109/CVPRW.2013.22",
    month = "June",
    pdf = "https://www.idiap.ch/~aanjos/papers/cvpr-bw-2013.pdf",
    abstract = "Besides the recognition task, today’s biometric systems need to cope with additional problem: spoofing attacks. Up to date, academic research considers spoofing as a binary classification problem: systems are trained to discriminate between real accesses and attacks. However, spoofing counter-measures are not designated to operate stand-alone, but as a part of a recognition system they will protect. In this paper, we study techniques for decisionlevel and score-level fusion to integrate a recognition and anti-spoofing systems, using an open-source framework that handles the ternary classification problem (clients, impostors and attacks) transparently. By doing so, we are able to report the impact of different spoofing counter-measures, fusion techniques and thresholding on the overall performance of the final recognition system. For a specific use case covering face verification, experiments show to what extent simple fusion improves the trustworthiness of the system when exposed to spoofing attacks."
}

2012

Ivana Chingovska, André Anjos, and Sébastien Marcel. On the effectiveness of local binary patterns in face anti-spoofing. In IEEE International Conference of the Biometrics Special Interest Group. 2012.
@inproceedings{biosig-2012,
    author = "Chingovska, Ivana and Anjos, André and Marcel, Sébastien",
    title = "On the Effectiveness of Local Binary Patterns in Face Anti-spoofing",
    booktitle = "IEEE International Conference of the Biometrics Special Interest Group",
    year = "2012",
    pdf = "https://www.idiap.ch/~aanjos/papers/biosig-2012.pdf",
    isbn = "978-3-88579-290-1",
    abstract = "Spoofing attacks are one of the security traits that biometric recognition systems are proven to be vulnerable to. When spoofed, a biometric recognition system is bypassed by presenting a copy of the biometric evidence of a valid user. Among all biometric modalities, spoofing a face recognition system is particularly easy to perform: all that is needed is a simple photograph of the user. In this paper, we address the problem of detecting face spoofing attacks. In particular, we inspect the potential of texture features based on Local Binary Patterns (LBP) and their variations on three types of attacks: printed photographs, and photos and videos displayed on electronic screens of different sizes. For this purpose, we introduce REPLAY-ATTACK, a novel publicly available face spoofing database which contains all the mentioned types of attacks. We conclude that LBP, with ~15\\% Half Total Error Rate, show moderate discriminability when confronted with a wide set of attack types."
}
André Anjos, Laurent El Shafey, Roy Wallace, Manuel Günther, Chris McCool, and Sébastien Marcel. Bob: a free signal processing and machine learning toolbox for researchers. In ACM Multimedia 2012, 1449–1452. 2012. doi:10.1145/2393347.2396517.
@inproceedings{acmmm-2012,
    author = "Anjos, André and Shafey, Laurent El and Wallace, Roy and Günther, Manuel and McCool, Chris and Marcel, Sébastien",
    title = "Bob: a free signal processing and machine learning toolbox for researchers",
    booktitle = "ACM Multimedia 2012",
    year = "2012",
    pages = "1449--1452",
    pdf = "https://www.idiap.ch/~aanjos/papers/acmmm-2012.pdf",
    doi = "10.1145/2393347.2396517",
    abstract = "Bob is a free signal processing and machine learning toolbox originally developed by the Biometrics group at Idiap Research Institute, Switzerland. The toolbox is designed to meet the needs of researchers by reducing development time and efficiently processing data. Firstly, Bob provides a researcher-friendly Python environment for rapid development. Secondly, efficient processing of large amounts of multimedia data is provided by fast C++ implementations of identified bottlenecks. The Python environment is integrated seamlessly with the C++ library, which ensures the library is easy to use and extensible. Thirdly, Bob supports reproducible research through its integrated experimental protocols for several databases. Finally, a strong emphasis is placed on code clarity, documentation, and thorough unit testing. Bob is thus an attractive resource for researchers due to this unique combination of ease of use, efficiency, extensibility and transparency. Bob is an open-source library and an ongoing community effort."
}
Tiago de Freitas Pereira, André Anjos, José Mario De Martino, and Sébastien Marcel. LBP-TOP based countermeasure against facial spoofing attacks. In International Workshop on Computer Vision With Local Binary Pattern Variants. 2012. doi:10.1007/978-3-642-37410-4_11.
@inproceedings{accv-2012,
    author = "de Freitas Pereira, Tiago and Anjos, André and Martino, José Mario De and Marcel, Sébastien",
    title = "LBP-TOP based countermeasure against facial spoofing attacks",
    booktitle = "International Workshop on Computer Vision With Local Binary Pattern Variants",
    year = "2012",
    doi = "10.1007/978-3-642-37410-4\_11",
    pdf = "https://www.idiap.ch/~aanjos/papers/accv-2012.pdf",
    abstract = "User authentication is an important step to protect informa- tion and in this field face biometrics is advantageous. Face biometrics is natural, easy to use and less human-invasive. Unfortunately, recent work has revealed that face biometrics is vulnerable to spoofing attacks using low-tech cheap equipments. This article presents a countermeasure against such attacks based on the LBP−TOP operator combining both space and time information into a single multiresolution texture descrip- tor. Experiments carried out with the REPLAY ATTACK database show a Half Total Error Rate (HTER) improvement from 15.16\\% to 7.60\\%."
}

2011

André Anjos and Sébastien Marcel. Counter-measures to photo attacks in face recognition: a public database and a baseline. In International Joint Conference on Biometrics 2011. October 2011. doi:10.1109/IJCB.2011.6117503.
@inproceedings{ijcb-2011-2,
    author = "Anjos, André and Marcel, Sébastien",
    title = "Counter-Measures to Photo Attacks in Face Recognition: a public database and a baseline",
    booktitle = "International Joint Conference on Biometrics 2011",
    year = "2011",
    month = "October",
    pdf = "https://www.idiap.ch/~aanjos/papers/ijcb-2011-2.pdf",
    doi = "10.1109/IJCB.2011.6117503",
    abstract = "A common technique to by-pass 2-D face recognition systems is to use photographs of spoofed identities. Unfortunately, research in counter-measures to this type of attack have not kept-up - even if such threats have been known for nearly a decade, there seems to exist no consensus on best practices, techniques or protocols for developing and testing spoofing-detectors for face recognition. We attribute the reason for this delay, partly, to the unavailability of public databases and protocols to study solutions and compare results. To this purpose we introduce the publicly available PRINT-ATTACK database and exemplify how to use its companion protocol with a motion-based algorithm that detects correlations between the person\'s head movements and the scene context. The results are to be used as basis for comparison to other counter-measure techniques. The PRINT-ATTACK database contains 200 videos of real-accesses and 200 videos of spoof attempts using printed photographs of 50 different identities."
}
Murali Mohan Chakka, André Anjos, Sébastien Marcel, and others. Competition on counter measures to 2-D facial spoofing attacks. In International Joint Conference on Biometrics 2011. October 2011. doi:10.1109/IJCB.2011.6117509.
@inproceedings{ijcb-2011,
    author = "Chakka, Murali Mohan and Anjos, André and Marcel, Sébastien and others",
    title = "Competition on Counter Measures to 2-D Facial Spoofing Attacks",
    booktitle = "International Joint Conference on Biometrics 2011",
    year = "2011",
    month = "October",
    doi = "10.1109/IJCB.2011.6117509",
    pdf = "https://www.idiap.ch/~aanjos/papers/ijcb-2011.pdf",
    abstract = "Spoofing identities using photographs is one of the most common techniques to attack 2-D face recognition systems. There seems to exist no comparative studies of different techniques using the same protocols and data. The motivation behind this competition is to compare the performance of different state-of-the-art algorithms on the same database using a unique evaluation method. Six different teams from universities around the world have participated in the contest. Use of one or multiple techniques from motion, texture analysis and liveness detection appears to be the common trend in this competition. Most of the algorithms are able to clearly separate spoof attempts from real accesses. The results suggest the investigation of more complex attacks."
}

2009

The ATLAS Collaboration. Configuration and control of the ATLAS trigger and data acquisition. In The 1st international conference on Technology and Instrumentation in Particle Physics. 2009.
@inproceedings{tipp-2009,
    author = "Collaboration, The ATLAS",
    title = "Configuration and Control of the ATLAS Trigger and Data Acquisition",
    booktitle = "The 1st international conference on Technology and Instrumentation in Particle Physics",
    year = "2009",
    pdf = "https://www.idiap.ch/~aanjos/papers/tipp-2009.pdf",
    abstract = "ATLAS is the biggest of the experiments aimed at studying high-energy particle interactions at the Large Hadron Collider (LHC). This paper describes the evolution of the Controls and Configuration system of the ATLAS Trigger and Data Acquisition (TDAQ) from the Technical Design Report (TDR) in 2003 to the first events taken at CERN with circulating beams in autumn 2008. The present functionality and performance and the lessons learned during the development are outlined. At the end we will also highlight some of the challenges which still have to be met by 2010, when the full scale of the trigger farm will be deployed."
}
The ATLAS Collaboration. ATLAS trigger and data acquisition: capabilities and commissioning. In 11th Pisa Meeting on Advanced Detectors: Frontier Detectors for Frontier Physics, La Biodola, Italy, 24 - 30 May 2009. 2009.
@inproceedings{pisa-2009,
    author = "Collaboration, The ATLAS",
    title = "ATLAS Trigger and Data Acquisition: capabilities and commissioning",
    booktitle = "11th Pisa Meeting on Advanced Detectors on Frontier Detectors For Frontier Physics, La Biodola, Italy, 24 - 30 May 2009",
    year = "2009",
    pdf = "https://www.idiap.ch/~aanjos/papers/pisa-2009.pdf",
    abstract = "The ATLAS trigger system is based on three levels of event selection that selects the physics of interest from an initial bunch crossing rate of 40~MHz to an output rate of sim200~Hz compatible with the offline computing power and storage capacity. During nominal LHC operations at a luminosity of 1034~cm−2s−1, decisions must be taken every 25~ns. The LHC is expected to begin operations with a peak luminosity of 1031~cm−2s−1 with far fewer number of bunches, but quickly ramp up to higher luminosities. Hence, the ATLAS Trigger and Data Acquisition system needs to adapt to the changing beam conditions preserving the interesting physics and detector requirements that may vary with these conditions."
}
The ATLAS Collaboration. ATLAS trigger for first physics and beyond. In Physics at LHC 2008, 29 September - 4 October 2008, Split, Croatia. 2009.
@inproceedings{lhc-2009,
    author = "Collaboration, The ATLAS",
    title = "Atlas trigger for first physics and beyond",
    booktitle = "Physics at LHC 2008 29 September - October 4, 2008 Split, Croatia",
    year = "2009",
    pdf = "https://www.idiap.ch/~aanjos/papers/lhc-2009.pdf",
    abstract = "ATLAS is a multi-purpose spectrometer built to perform precision measurements of Standard Model parameters and is aiming at discovery of Higgs particle, Super Symmetry and possible other physics channels beyond Standard Model. Operating at 14 TeV center of mass energy ATLAS will see 40 million events per second at nominal luminosity with about 25 overlapping interactions. Most of the events are inelastic proton-proton interactions with only few W, Z bosons or ttbar pairs produced each second, and expectations for Higgs or SUSY production cross-section are much smaller than that. ATLAS trigger has a difficult task to select one out of 10 5 events online and to ensure that most physics channels of interests are preserved for analysis. In this talk we will review the design of ATLAS trigger system, the trigger menu prepared for initial LHC run as well as for high luminosity run. The expected trigger performance of the base-line ATLAS physics programs will be reviewed and first results from the commissioning period will be given. The methods to measure trigger efficiencies and biases directly from data will be discussed."
}
R.C. Torres, A. Anjos, and J.M. Seixas. Automatizing the online filter test management for a general-purpose particle detector. Computer Physics Communications, October 2009.
@article{cpc-2009,
    author = "Torres, R.C. and Anjos, A. and Seixas, J.M.",
    title = "Automatizing the Online Filter Test Management for a General-Purpose Particle Detector",
    journal = "Computer Physics Communications",
    year = "2009",
    month = "October",
    pdf = "https://www.idiap.ch/~aanjos/papers/cpc-2009.pdf",
    abstract = "This paper presents a software environment to automatically configure and run online triggering and dataflow farms for the ATLAS experiment at the Large Hadron Collider (LHC). It provides support for a broad set of users, with distinct knowledge about the online triggering system, ranging from casual testers to final system deployers. This level of automatization improves the overall ATLAS TDAQ work flow for software and hardware tests and speeds-up system modifications and deployment."
}
The ATLAS Collaboration. Commissioning of the ATLAS high level trigger with single beam and cosmic rays. In Computing in High Energy and Nuclear Physics, Prague, Czech Republic, 21 - 27 Mar 2009. 2009.
@inproceedings{chep-2009-2,
    author = "Collaboration, The ATLAS",
    title = "Commissioning of the ATLAS High Level Trigger with Single Beam and Cosmic Rays",
    booktitle = "Computing in High Energy and Nuclear Physics, Prague, Czech Republic, 21 - 27 Mar 2009",
    year = "2009",
    pdf = "https://www.idiap.ch/~aanjos/papers/chep-2009-2.pdf",
    abstract = "ATLAS is one of the two general-purpose detectors at the Large Hadron Collider (LHC). The trigger system is responsible for making the online selection of interesting collision events. At the LHC design luminosity of 10^34 cm^-2s^-1 it will need to achieve a rejection factor of the order of 10^-7 against random proton-proton interactions, while selecting with high efficiency events that are needed for physics analyses. After a first processing level using custom electronics based on FPGAs and ASICs, the trigger selection is made by software running on two processor farms, containing a total of around two thousand multi-core machines. This system is known as the High Level Trigger (HLT). To reduce the network data traffic and the processing time to manageable levels, the HLT uses seeded, step-wise reconstruction, aiming at the earliest possible rejection of background events. The recent LHC startup and short single-beam run provided a \'stress test\' of the system and some initial calibration data. Following this period, ATLAS continued to collect cosmic-ray events for detector alignment and calibration purposes. After giving an overview of the trigger design and its innovative features, this paper focuses on the experience gained from operating the ATLAS trigger with single LHC beams and cosmic-rays."
}
The ATLAS Collaboration. The ATLAS online high level trigger framework: experience reusing offline software components in the ATLAS trigger. In Computing in High Energy and Nuclear Physics, Prague, Czech Republic, 21 - 27 Mar 2009. 2009.
@inproceedings{chep-2009,
    author = "Collaboration, The ATLAS",
    title = "The ATLAS online High Level Trigger framework: experience reusing offline software components in the ATLAS trigger",
    booktitle = "Computing in High Energy and Nuclear Physics, Prague, Czech Republic, 21 - 27 Mar 2009",
    year = "2009",
    pdf = "https://www.idiap.ch/~aanjos/papers/chep-2009.pdf",
    abstract = "Event selection in the Atlas High Level Trigger is accomplished to a large extent by reusing software components and event selection algorithms developed and tested in an offline environment. Many of these offline software modules are not specifically designed to run in a heavily multi-threaded online data flow environment. The Atlas High Level Trigger (HLT) framework based on the Gaudi and Atlas Athena frameworks, forms the interface layer, which allows the execution of the HLT selection and monitoring code within the online run control and data flow software. While such an approach provides a unified environment for trigger event selection across all of Atlas, it also poses strict requirements on the reused software components in terms of performance, memory usage and stability. Experience of running the HLT selection software in the different environments and especially on large multi-node trigger farms has been gained in several commissioning periods using preloaded Monte Carlo events, in data taking periods with cosmic events and in a short period with proton beams from LHC. The contribution discusses the architectural aspects of the HLT framework, its performance and its software environment within the Atlas computing, trigger and data flow projects. Emphasis is also put on the architectural implications for the software by the use of multi-core processors in the computing farms and the experiences gained with multi-threading and multi-process technologies."
}
The ATLAS Collaboration. ATLAS trigger status and results from commissioning operations. In Advanced Computing on High-Energy Physics 2008, Erice, Sicily, Italy. 2009.
@inproceedings{acat-2009,
    author = "Collaboration, The ATLAS",
    title = "ATLAS Trigger Status and Results From Commissioning Operations",
    booktitle = "Advanced Computing on High-Energy Physics 2008, Erice, Sicily, Italy",
    year = "2009",
    pdf = "https://www.idiap.ch/~aanjos/papers/acat-2009.pdf",
    abstract = "The ATLAS trigger system is designed to select rare physics processes of interest from an extremely high rate of proton-proton collisions, reducing the LHC incoming rate of about 107. The short LHC bunch crossing period of 25 ns and the large background of soft-scattering events overlapped in each bunch crossing pose serious challenges, both on hardware and software, that the ATLAS trigger must overcome in order to efficiently select interesting events. The ATLAS trigger consists of hardware based Level-1, and a two-level software based High-Level Trigger (HLT). Data bandwidth and processing times in the higher level triggers are reduced by region of interest guidance in the HLT reconstruction steps. High flexibility is critical in order to adapt to the changing luminosity, backgrounds and physics goals. It is achieved by the use of inclusive trigger menus and modular software design. Selection algorithms have been developed which provide the required elasticity to detect different physics signatures and to control the trigger rates. In this paper an overview of the ATLAS trigger design, status and expected performance, as well as the results from the on-going commissioning with cosmic rays and first LHC beams, is presented."
}

2008

André Anjos. Trigger systems. In Experimental High-Energy Physics and Associated Technologies Workshop. 2008.
@inproceedings{talk-2008,
    author = "Anjos, André",
    title = "Trigger Systems",
    booktitle = "Experimental High-Energy Physics and Associated Technologies Workshop",
    year = "2008",
    pdf = "https://www.idiap.ch/~aanjos/papers/talk-2008.pptx",
    abstract = "This is an invited talk. The contents were based on the fundamentals of Triggering System in High-Energy Physics experiments."
}
The ATLAS Collaboration. The ATLAS experiment at the CERN Large Hadron Collider. Journal of Instrumentation, 3:S08003, August 2008.
@article{jinst-2008,
    author = "Collaboration, The ATLAS",
    title = "The ATLAS Experiment at the CERN Large Hadron Collider",
    journal = "Journal of Instrumentation",
    year = "2008",
    month = "August",
    volume = "3",
    number = "S08003",
    pdf = "https://www.idiap.ch/~aanjos/papers/jinst-2008.pdf",
    abstract = "The ATLAS detector as installed in its experimental cavern at point 1 at CERN is described in this paper. A brief overview of the expected performance of the detector when the Large Hadron Collider begins operation is also presented."
}
The ATLAS Collaboration. Readiness of the ATLAS trigger and data acquisition system for the first LHC beams. In 11th Topical Seminar On Innovative Particle And Radiation Detectors, Siena, Italy. 2008.
@inproceedings{iprd-2008,
    author = "Collaboration, The ATLAS",
    title = "Readiness of the ATLAS Trigger and Data Acquisition system for the first LHC beams",
    booktitle = "11th Topical Seminar On Innovative Particle And Radiation Detectors, Siena, Italy",
    year = "2008",
    pdf = "https://www.idiap.ch/~aanjos/papers/iprd-2008.pdf",
    abstract = "The ATLAS Trigger and Data Acquisition (TDAQ) system is based on O(2k) processing nodes, interconnected by a multi-layer Gigabit network, and consists of a combination of custom electronics and commercial products. In its final configuration, O(20k) applications will provide the needed capabilities in terms of event selection, data flow, local storage and data monitoring. In preparation for the first LHC beams, many TDAQ sub-systems already reached the final configuration and roughly one third of the final processing power has been deployed. Therefore, the current system allows for a sensible evaluation of the performance and scaling properties. In this paper we introduce the ATLAS TDAQ system requirements and architecture and we discuss the status of software and hardware component. We moreover present the results of performance measurements validating the system design and providing a figure for the ATLAS data acquisition capabilities in the initial data taking period."
}
The ATLAS Collaboration. Expected performance of the ATLAS experiment detector, trigger, physics. Technical Report 2008–020, CERN Open Documentation, 2008.
@techreport{cern-2008,
    author = "Collaboration, The ATLAS",
    title = "Expected Performance of the ATLAS Experiment Detector, Trigger, Physics",
    institution = "CERN Open Documentation",
    year = "2008",
    number = "2008--020",
    pdf = "https://www.idiap.ch/~aanjos/papers/cern-2008.pdf",
    abstract = "A detailed study is presented of the expected performance of the ATLAS detector. The reconstruction of tracks, leptons, photons, missing energy and jets is investigated, together with the performance of b-tagging and the trigger. The physics potential for a variety of interesting physics processes, within the Standard Model and beyond, is examined. The study comprises a series of notes based on simulations of the detector and physics processes, with particular emphasis given to the data expected from the first years of operation of the LHC at CERN."
}
André Anjos on behalf of the ATLAS Collaboration. The DAQ/HLT system of the ATLAS experiment. In International Workshop on Advanced Computing and Analysis Techniques in Physics Research. 2008.
@inproceedings{acat-2008,
    author = "on behalf of the ATLAS Collaboration, André Anjos",
    title = "The DAQ/HLT system of the ATLAS experiment",
    booktitle = "International Workshop on Advanced Computing and Analysis Techniques in Physics Research",
    year = "2008",
    pdf = "https://www.idiap.ch/~aanjos/papers/acat-2008.pdf",
    abstract = "The DAQ/HLT system of the ATLAS experiment at CERN, Switzerland, is being commissioned for first collisions in 2009. Presently, the system is composed of an already very large farm of computers that accounts for about one-third of its event processing capacity. Event selection is conducted in two steps after the hardware-based Level-1 Trigger: a Level-2 Trigger processes detector data based on regions of interest (RoI) and an Event Filter operates on the full event data assembled by the Event Building system. The detector readout is fully commissioned and can be operated at its full design capacity. This places on the High-Level Triggers system the responsibility to maximize the quality of data that will finally reach the offline reconstruction farms. This paper brings an overview of the current ATLAS DAQ/HLT implementation and performance based on studies originated from its operation with simulated, cosmic particles and first-beam data. Its built-in event processing parallelism is discussed for both HLT levels as well as an outlook of options to improve it."
}

2007

Thiago Ciodaro Xavier, André Rabello Anjos, and José Manoel de Seixas. Discriminação neural de partículas para um detector submetido a uma alta taxa de eventos. Learning and Nonlinear Models - Revista da Sociedade Brasileira de Redes Neurais (SBRN), 4(2):79–92, October 2007.
Article:
@article{sbrn-2007,
    author = "Xavier, Thiago Ciodaro and Anjos, André Rabello and de Seixas, José Manoel",
    title = "Discriminação Neural de Partículas para um Detector Submetido a uma Alta Taxa de Eventos",
    journal = "Learning and Nonlinear Models - Revista da Sociedade Brasileira de Redes Neurais (SBRN)",
    year = "2007",
    month = "October",
    volume = "4",
    number = "2",
    pages = "79--92",
    pdf = "https://www.idiap.ch/~aanjos/papers/sbrn-2007.pdf",
    abstract = "This article (written in portuguese) presents the results of using neural networks for the optimization of the ATLAS online filtering system, one of the main detectors of the particle collider LHC (Large Hadron Collider). The Regions of interests of the ATLAS energy measurer calorimeter are mapped in 100 rings of energy deposition, which feed a classifier neural network to classify them as electron or jet. For the signal pre-processing, it is used a relevance mapping and PCA (Principal Component Analysis) to compact the information, increasing the processing speed and, eventually, increasing the detection efficiency, with a decreasing of false alarm rate."
}
The ATLAS Collaboration. Integration of the trigger and data acquisition systems in atlas. In IEEE Real-time conference. 2007.
Article:
@inproceedings{rt-2007-3,
    author = "Collaboration, The ATLAS",
    title = "Integration of the Trigger and Data Acquisition systems in ATLAS",
    booktitle = "IEEE Real-time conference",
    year = "2007",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/rt-2007-3.pdf",
    abstract = "During 2006 and spring 2007, integration and commissioning of trigger and data acquisition (TDAQ) equipment in the ATLAS experimental area has progressed. Much of the work has focused on a final prototype setup consisting of around eighty computers representing a subset of the full TDAQ system. There have been a series of technical runs using this setup. Various tests have been run including ones where around 6k Level-1 pre-selected simulated proton-proton events have been processed in a loop mode through the trigger and dataflow chains. The system included the readout buffers containing the events, event building, second level and third level trigger algorithms. Quantities critical for the final system, such as event processing times, have been studied using different trigger algorithms as well as different dataflow components."
}
André Anjos on behalf of the ATLAS Collaboration. The configuration system of the atlas trigger. In IEEE Real-time conference. 2007.
Article:
@inproceedings{rt-2007-2,
    author = "on behalf of the ATLAS Collaboration, André Anjos",
    title = "The Configuration System of the ATLAS Trigger",
    booktitle = "IEEE Real-time conference",
    year = "2007",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/rt-2007-2.pdf",
    abstract = "The ATLAS detector at CERN\'\s LHC will be exposed to proton-proton collisions at a rate of 40 MHz. To reduce the data rate, only potentially interesting events are selected by a three-level trigger system. The first level is implemented in custom-made electronics, reducing the data output rate to less than 100 kHz. The second and third level are software triggers with a final output rate of 100 to 200 Hz. A system has been designed and implemented that holds and records the configuration information of all three trigger levels at a centrally maintained location. This system provides consistent configuration information to the online trigger for the purpose of data taking as well as to the offline trigger simulation. The use of relational database technology provides a means of reliable recording of the trigger configuration history over the lifetime of the experiment. Tools for flexible browsing of trigger configurations, and for their distribution across the ATLAS reconstruction sites have been developed. The usability of this design has been demonstrated in dedicated configuration tests of the ATLAS level-1 Central Trigger and of a 600-node software trigger computing farm. Further tests on a computing cluster which is part of the final high level trigger system were also successful."
}
The ATLAS Collaboration. Performance of the final event builder for the atlas experiment. In 15th IEEE Real Time Conference 2007. 2007.
Article:
@inproceedings{rt-2007,
    author = "Collaboration, The ATLAS",
    title = "Performance of the final Event Builder for the ATLAS Experiment",
    booktitle = "15th IEEE Real Time Conference 2007",
    year = "2007",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/rt-2007.pdf",
    abstract = "Event data from proton-proton collisions at the LHC will be selected by the ATLAS experiment in a three level trigger system, which reduces the initial bunch crossing rate of 40 MHz at its first two trigger levels (LVL1+LVL2) to ~3 kHz. At this rate the Event-Builder collects the data from all Read-Out system PCs (ROSs) and provides fully assembled events to the the Event-Filter (EF), which is the third level trigger, to achieve a further rate reduction to ~200 Hz for permanent storage. The Event-Builder is based on a farm of O(100) PCs, interconnected via Gigabit Ethernet to O(150) ROSs. These PCs run Linux and multi-threaded software applications implemented in C++. All the ROSs and one third of the Event-Builder PCs are already installed and commissioned. We report on performance tests on this initial system, which show promising results to reach the final data throughput required for the ATLAS experiment."
}
The ATLAS Collaboration. The atlas event builder. In IEEE Nuclear Science Symposium and Medical Imaging Conference. 2007.
Article:
@inproceedings{nss-2007,
    author = "Collaboration, The ATLAS",
    title = "The ATLAS Event Builder",
    booktitle = "IEEE Nuclear Science Symposium and Medical Imaging Conference",
    year = "2007",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/nss-2007.pdf",
    abstract = "Event data from proton-proton collisions at the LHC will be selected by the ATLAS experiment in a three-level trigger system, which, at its first two trigger levels (LVL1+LVL2), reduces the initial bunch crossing rate of 40 MHz to ∼3 kHz. At this rate, the Event Builder collects the data from the readout system PCs (ROSs) and provides fully assembled events to the Event Filter (EF). The EF is the third trigger level and its aim is to achieve a further rate reduction to ∼200 Hz on the permanent storage. The Event Builder is based on a farm of O(100)."
}
The ATLAS Collaboration. The atlas trigger - high-level trigger commissioning and operation during early data taking. In International Europhysics Conference on High Energy Physics. 2007.
Article:
@inproceedings{eurochep-2007,
    author = "Collaboration, The ATLAS",
    title = "The ATLAS trigger - high-level trigger commissioning and operation during early data taking",
    booktitle = "International Europhysics Conference on High Energy Physics",
    year = "2007",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/eurochep-2007.pdf",
    abstract = "The ATLAS experiment is one of the two general-purpose experiments due to start operation soon at the Large Hadron Collider (LHC). The LHC will collide protons at a centre of mass energy of 14~TeV, with a bunch-crossing rate of 40~MHz. The ATLAS three-level trigger will reduce this input rate to match the foreseen offline storage capability of 100-200~Hz. This paper gives an overview of the ATLAS High Level Trigger focusing on the system design and its innovative features. We then present the ATLAS trigger strategy for the initial phase of LHC exploitation. Finally, we report on the valuable experience acquired through in-situ commissioning of the system where simulated events were used to exercise the trigger chain. In particular we show critical quantities such as event processing times, measured in a large-scale HLT farm using a complex trigger menu."
}
The ATLAS Collaboration. The atlas trigger - commissioning with cosmic rays. In International Conference on Computing in High Energy and Nuclear Physics. 2007.
Article:
@inproceedings{chep-2007-2,
    author = "Collaboration, The ATLAS",
    title = "The ATLAS Trigger - Commissioning with cosmic rays",
    booktitle = "International Conference on Computing in High Energy and Nuclear Physics",
    year = "2007",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/chep-2007-2.pdf",
    abstract = "The ATLAS detector at CERN\'s LHC will be exposed to proton-proton collisions from beams crossing at 40 MHz. At the design luminosity there are roughly 23 collisions per bunch crossing. ATLAS has designed a three-level trigger system to select potentially interesting events. The first-level trigger, implemented in custom-built electronics, reduces the incoming rate to less than 100 kHz with a total latency of less than 2.5s. The next two trigger levels run in software on commercial PC farms. They reduce the output rate to 100-200 Hz. In preparation for collision data-taking which is scheduled to commence in May 2008, several cosmic-ray commissioning runs have been performed. Among the first sub-detectors available for commissioning runs are parts of the barrel muon detector including the RPC detectors that are used in the first-level trigger. Data have been taken with a full slice of the muon trigger and readout chain, from the detectors in one sector of the RPC system, to the second-level trigger algorithms and the data-acquisition system. The system is being prepared to include the inner-tracking detector in the readout and second-level trigger. We will present the status and results of these cosmic-ray based commissioning activities. This work will prove to be invaluable not only during the commissioning phase but also for cosmic-ray data-taking during the normal running for detector performance studies."
}
The ATLAS Collaboration. Alignment data streams for the atlas inner detector. In Computing for High-Energy Physics. 2007.
Article:
@inproceedings{chep-2007,
    author = "Collaboration, The ATLAS",
    title = "Alignment data streams for the ATLAS Inner Detector",
    booktitle = "Computing for High-Energy Physics",
    year = "2007",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/chep-2007.pdf",
    abstract = "The ATLAS experiment uses a complex trigger strategy to be able to reduce the Event Filter rate output, down to a level that allows the storage and processing of these data. These concepts are described in the ATLAS Computing Model which embraces Grid paradigm. The output coming from the Event Filter consists of four main streams: physical stream, express stream, calibration stream, and diagnostic stream. The calibration stream will be transferred to the Tier-0 facilities that will provide the prompt reconstruction of this stream with a minimum latency of 8 hours, producing calibration constants of sufficient quality to allow a first-pass processing. The Inner Detector community is developing and testing an independent common calibration stream selected at the Event Filter after track reconstruction. It is composed of raw data, in byte-stream format, contained in Readout Buffers (ROBs) with hit information of the selected tracks, and it will be used to derive and update a set of calibration and alignment constants. This option was selected because it makes use of the Byte Stream Converter infrastructure and possibly gives better bandwidth usage and storage optimization. Processing is done using specialized algorithms running in the Athena framework in dedicated Tier-0 resources, and the alignment constants will be stored and distributed using the COOL conditions database infrastructure. This work is addressing in particular the alignment requirements, the needs for track and hit selection, and the performance issues."
}
Rodrigo Coura Torres, José Manoel Seixas, André Rabello dos Anjos, and Danilo Vannier Cunha. Online electron/jet neural high-level trigger over independent calorimetry information. In XI International Workshop on Advanced Computing and Analysis Techniques in Physics Research. 2007.
Article:
@inproceedings{acat-2007,
    author = "Torres, Rodrigo Coura and Seixas, José Manoel and dos Anjos, André Rabello and Cunha, Danilo Vannier",
    title = "Online Electron/Jet Neural High-Level Trigger over Indenpendent Calorimetry Information",
    booktitle = "XI International Workshop on Advanced Computing and Analysis Techniques in Physics Research",
    year = "2007",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/acat-2007.pdf",
    abstract = "A data volume of 60 TB/s is expected from the high LHC collision rate and high resolution of the ATLAS detectors. To cope with this bandwidth, a highly programmable, three-level online triggering system is under development. One of the main components of this system is an electron/jet discriminator that uses the highly segmented calorimetry information. In this work, we address the electron/jet discrimination at the second-level trigger by building a set of concentric ring sums around the energy deposition peak in each calorimeter segment. An Independent Component Analysis (ICA) on these ring sums is then performed to extract the main sources of the calorimeter signal. The extracted independent components feed the input nodes of a neural electron/jet discriminator. The proposed system is able to achieve higher detection efficiency than the current electron/jet discriminating system operating in ATLAS, while being fast enough to cope with the time restrictions of the ATLAS triggering system operation."
}

2006

André Anjos. Sistema Online de Filtragem em um Ambiente com Alta Taxa de Eventos. PhD thesis, COPPE/UFRJ, 2006.
Article:
@phdthesis{phd-thesis-2006,
    author = "Anjos, André",
    title = "Sistema Online de Filtragem em um Ambiente com Alta Taxa de Eventos",
    school = "COPPE/UFRJ",
    year = "2006",
    pdf = "https://www.idiap.ch/~aanjos/papers/phd-thesis-2006.zip",
    abstract = "The ATLAS experiment at CERN, Switzerland, will count on a triggering system that separates the ordinary physics from the one representing decays of the rare Higgs boson. The Second Level of such a Trigger system will be composed 1,000 computers connected by commodity networks, processing each event approved the First Level Trigger in no more than 10 milliseconds. A set of algorithms described via software will operate in this filtering level. Among them, electron detection systems play a fundamental role to the data acquisition since the existence of these particles can represent interesting physics. In this work, we present more efficient discrimination algorithms based on artificial neural networks and a compaction system which benefits from the energy deposit profiles of these particles in calorimeters, reaching a classification efficiency of 97.6\\% for electrons for a false-alarm of only 3.2\\% in jets. This detection algorithm is implemented as part of the experiment\'s complex software infraestructure and can be executed in only 125 microseconds."
}
The ATLAS Collaboration. The atlas data acquisition and trigger: concept, design and status. Nucl. Phys. B, Proc. Suppl., 172:178–182, November 2006.
Article:
@article{nimb-2006,
    author = "Collaboration, The ATLAS",
    title = "The ATLAS Data Acquisition and Trigger : concept, design and status",
    journal = "Nucl. Phys. B, Proc. Suppl.",
    year = "2006",
    month = "November",
    volume = "172",
    OPTnumber = "",
    pages = "178--182",
    pdf = "https://www.idiap.ch/~aanjos/papers/nimb-2006.pdf",
    abstract = "This article presents the base-line design and implementation of the ATLAS Trigger and Data Acquisition system, in particular the Data Flow and High Level Trigger components. The status of the installation and commissioning of the system is also presented."
}
A. Anjos, R.C. Torres, J.M. Seixas, B.C. Ferreira, and T.C. Xavier. Neural triggering system operating on high resolution calorimetry information. Nuclear Instruments and Methods in Physics Research, 559:134–138, April 2006.
Article:
@article{nima-2006,
    author = "Anjos, A. and Torres, R.C. and Seixas, J.M. and Ferreira, B.C. and Xavier, T.C.",
    title = "Neural triggering system operating on high resolution calorimetry information",
    journal = "Nuclear Instruments and Methods in Physics Research",
    year = "2006",
    month = "April",
    volume = "559",
    OPTnumber = "",
    pages = "134--138",
    pdf = "https://www.idiap.ch/~aanjos/papers/nima-2006.pdf",
    abstract = "This paper presents an electron/jet discriminator system for operating at the Second Level Trigger of ATLAS. The system processes calorimetry data and organizes the regions of interest in the calorimeter in the form of concentric ring sums of energy deposition, so that both signal compaction and high performance can be achieved. The ring information is fed into a feed forward neural discriminator. This implementation resulted on a 97\\% electron detection efficiency for a false alarm of 3\\%. The full discrimination chain could still be executed in less than 500 microseconds."
}
André Anjos on behalf of the ATLAS Collaboration. A configuration system for the atlas trigger. Journal of Instrumentation, Institute of Physics Publishing and Sissa, February 2006.
Article:
@article{jinst-2006,
    author = "on behalf of the ATLAS Collaboration, André Anjos",
    title = "A configuration system for the ATLAS trigger",
    journal = "Journal of Instrumentation, Institute of Physics Publishing and Sissa",
    year = "2006",
    month = "February",
    OPTvolume = "",
    number = "P05004",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/jinst-2006.pdf",
    abstract = "The ATLAS detector at CERN\'s Large Hadron Collider will be exposed to proton–proton collisions from beams crossing at 40 MHz that have to be reduced to the few hundreds of Hz allowed by the storage systems. A three-level trigger system has been designed to achieve this goal. We describe the configuration system under construction for the ATLAS trigger chain. It provides the trigger system with all the parameters required for decision taking and to record its history. The same system configures the event reconstruction, Monte Carlo simulation and data analysis, and provides tools for accessing and manipulating the configuration data in all contexts."
}
André Anjos on behalf of the ATLAS Collaboration. Deployment of the atlas high-level trigger. IEEE Transactions on Nuclear Science, 53:2144–2149, August 2006.
Article:
@article{ieee-tns-2006,
    author = "on behalf of the ATLAS Collaboration, André Anjos",
    title = "Deployment of the ATLAS High-Level Trigger",
    journal = "IEEE Transactions on Nuclear Science",
    year = "2006",
    month = "August",
    volume = "53",
    OPTnumber = "",
    pages = "2144--2149",
    pdf = "https://www.idiap.ch/~aanjos/papers/ieee-tns-2006.pdf",
    abstract = "The ATLAS combined test beam in the second half of 2004 saw the first deployment of the ATLAS High-Level Trigger (HLT). The next steps are deployment on the pre-series farms in the experimental area during 2005, commissioning and cosmics tests with the full detector in 2006 and collisions in 2007. This paper reviews the experience gained in the test beam, describes the current status and discusses the further enhancements to be made. We address issues related to the dataflow, integration of selection algorithms, testing, software distribution, installation and improvements."
}
The ATLAS Collaboration. Testing on a large scale: running the atlas data acquisition and high level trigger software on 700 pc nodes. In Computing In High Energy and Nuclear Physics. 2006.
Article:
@inproceedings{chep-2006-3,
    author = "Collaboration, The ATLAS",
    title = "Testing on a Large Scale: running the ATLAS Data Acquisition and High Level Trigger Software on 700 PC Nodes",
    booktitle = "Computing In High Energy and Nuclear Physics",
    year = "2006",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/chep-2006-3.doc",
    abstract = "The ATLAS Data Acquisition (DAQ) and High Level Trigger (HLT) software system will be comprised initially of 2000 PC nodes which take part in the control, event readout, second level trigger and event filter operations. This high number of PCs will only be purchased before data taking in 2007. The large CERN IT LXBATCH facility provided the opportunity to run in July 2005 online functionality tests over a period of 5 weeks on a stepwise increasing farm size from 100 up to 700 PC dual nodes. The interplay between the control and monitoring software with the event readout, event building and the trigger software has been exercised the first time as an integrated system on this large scale. New was also to run algorithms in the online environment for the trigger selection and in the event filter processing tasks on a larger scale. A mechanism has been developed to package the offline software together with the DAQ/HLT software and to distribute it via peer-to-peer software efficiently to this large pc cluster. The findings obtained during the tests lead to many immediate improvements in the software. Trend analysis allowed identifying critical areas. Running an online system on a cluster of 700 nodes successfully was found to be especially sensitive to the reliability of the farm as well as the DAQ/HLT system itself and the future development will concentrate on fault tolerance and stability."
}
The ATLAS Collaboration. Atlas high level trigger infrastructure, roi collection and event building. In 15th International Conference on Computing In High Energy and Nuclear Physics. 2006.
Article:
@inproceedings{chep-2006-2,
    author = "Collaboration, The ATLAS",
    title = "ATLAS High Level Trigger Infrastructure, ROI Collection and Event Building",
    booktitle = "15th International Conference on Computing In High Energy and Nuclear Physics",
    year = "2006",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/chep-2006-2.pdf",
    abstract = "We describe the base-line design and implementation of the Data Flow and High Level Trigger (HLT) part of the ATLAS Trigger and Data Acquisition (TDAQ) system. We then discuss improvements and generalization of the system design to allow the handling of events in parallel data streams and we present the possibility for event duplication, partial Event Building and data stripping. We then present tests on the deployment and integration of the TDAQ infrastructure and algorithms at the TDAQ \'pre-series\' cluster (~10\\% of full ATLAS TDAQ). Finally, we tackle two HLT performance issues."
}
The ATLAS Collaboration. Studies with the atlas trigger and data acquisition pre-series setup. In 15th International Conference on Computing In High Energy and Nuclear Physics. 2006.
Article:
@inproceedings{chep-2006,
    author = "Collaboration, The ATLAS",
    title = "Studies with the ATLAS Trigger and Data Acquisition Pre-Series Setup",
    booktitle = "15th International Conference on Computing In High Energy and Nuclear Physics",
    year = "2006",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/chep-2006.pdf",
    abstract = "The pre-series test bed is used to validate the technology and implementation choices by comparing the final ATLAS readout requirements, to the results of performance, functionality and stability studies. We show that all the components which are not running reconstruction algorithms match the final ATLAS requirements. For the others, we calculate the amount of time per event that could be allocated to run these not-yet-finalized algorithms. We also report on the experience gained during these studies while interfacing with a sub-detector for the first time at the experimental area."
}

2005

André Anjos on behalf of the ATLAS Collaboration. Configuration of the atlas trigger. In 14th IEEE NPSS Real Time Conference, 990–994. 2005.
Article:
@inproceedings{rt-2005,
    author = "on behalf of the ATLAS Collaboration, André Anjos",
    title = "Configuration of the ATLAS trigger",
    booktitle = "14th IEEE NPSS Real Time Conference",
    year = "2005",
    OPTvolume = "",
    OPTnumber = "",
    pages = "990--994",
    pdf = "https://www.idiap.ch/~aanjos/papers/rt-2005.pdf",
    abstract = "The ATLAS detector at CERN\'s LHC will be exposed to proton-proton collisions at a rate of 40 MHz. In order to reduce the data rate to about 200 Hz, only potentially interesting events are selected by a three-level trigger system. Its first level is implemented in electronics and firmware whereas the higher trigger levels are based on software. To prepare the full trigger chain for the online event selection according to a certain strategy, a system is being set up that provides the relevant configuration information - e.g. values for hardware registers in level-1 or parameters of high-level trigger algorithms - and stores the corresponding history. The same information is used to configure the offline trigger simulation. In this presentation an overview of the ATLAS trigger system is given concentrating on the event selection strategy and its description. The technical implementation of the configuration system is summarized."
}
The ATLAS Collaboration. Overview of the high-level trigger electron and photon selection for the atlas experiment at the lhc. IEEE Transactions on Nuclear Science, 53:2839–2843, June 2005.
Article:
@article{ieee-tns-2005-3,
    author = "Collaboration, The ATLAS",
    title = "Overview of the High-Level Trigger Electron and Photon Selection for the ATLAS Experiment at the LHC",
    journal = "IEEE Transactions Nuclear Sciences (2005)",
    year = "2005",
    month = "June",
    volume = "53",
    OPTnumber = "",
    pages = "2839--2843",
    pdf = "https://www.idiap.ch/~aanjos/papers/ieee-tns-2005-3.pdf",
    abstract = "The ATLAS experiment at the Large Hadron Collider (LHC) will face the challenge of efficiently selecting interesting candidate events in pp collisions at 14 TeV center-of-mass energy, whilst rejecting the enormous number of background events. The High-Level Trigger (HLT = second level trigger and Event Filter), which is a software based trigger will need to reduce the level-1 output rate of ~75 kHz to ~200 Hz written out to mass storage. In this talk an overview of the current physics and system performance of the HLT selection for electrons and photons is given. The performance has been evaluated using Monte Carlo simulations and has been partly demonstrated in the ATLAS testbeam in 2004. The efficiency for the signal channels, the rate expected for the selection, the global data preparation and execution times will be highlighted. Furthermore, some physics examples will be discussed to demonstrate that the triggers are well adapted for the physics programme envisaged at the LHC."
}
The ATLAS Collaboration. Implementation and performance of the seeded reconstruction for the atlas event filter selection software. IEEE Trans. Nucl. Sci., 53 (2007):864–869, June 2005.
Article:
@article{ieee-tns-2005-2,
    author = "Collaboration, The ATLAS",
    title = "Implementation and Performance of the Seeded Reconstruction for the ATLAS Event Filter Selection Software",
    journal = "IEEE Trans. Nucl. Sciences",
    year = "2005",
    month = "June",
    volume = "53 (2007)",
    OPTnumber = "",
    pages = "864--869",
    pdf = "https://www.idiap.ch/~aanjos/papers/ieee-2005-2.pdf",
    abstract = "ATLAS is one of the four LHC experiments that will start data taking in 2007, designed to cover a wide range of physics topics. The ATLAS trigger system has to cope with a rate of 40 MHz and 23 interactions per bunch crossing. It is divided in three different levels. The first one (hardware based) provides a signature that is confirmed by the the following trigger levels (software based) by running a sequence of algorithms and validating the signal step by step, looking only to the region of the space indicated by the first trigger level (seeding). In this paper, the performance of one of these sequences that run at the Event Filter level (third level) and is composed of clustering at the calorimeter, track reconstruction and matching."
}
The ATLAS Collaboration. Atlas dataflow: the read-out subsystem, results from trigger and data-acquisition system testbed studies and from modeling. IEEE Trans. Nucl. Sci., 53 (2006):912–917, June 2005.
Article:
@article{ieee-tns-2005,
    author = "Collaboration, The ATLAS",
    title = "ATLAS DataFlow: the Read-Out Subsystem, Results from Trigger and Data-Acquisition System Testbed Studies and from Modeling",
    journal = "IEEE Trans. Nucl. Sciences",
    year = "2005",
    month = "June",
    volume = "53 (2006)",
    OPTnumber = "",
    pages = "912--917",
    pdf = "https://www.idiap.ch/~aanjos/papers/ieee-tns-2005.pdf",
    abstract = "In the ATLAS experiment at the LHC, the output of readout hardware specific to each subdetector will be transmitted to buffers, located on custom made PCI cards (ROBINs). The data consist of fragments of events accepted by the first-level trigger at a maximum rate of 100 kHz. Groups of four ROBINs will be hosted in about 150 Read-Out Subsystem (ROS) PCs. Event data are forwarded on request via Gigabit Ethernet links and switches to the second-level trigger or to the Event builder. In this paper a discussion of the functionality and real-time properties of the ROS is combined with a presentation of measurement and modelling results for a testbed with a size of about 20\\% of the final DAQ system. Experimental results on strategies for optimizing the system performance, such as utilization of different network architectures and network transfer protocols, are presented for the testbed, together with extrapolations to the full system."
}
The ATLAS Collaboration. Implementation and performance of a tau lepton selection within the atlas trigger system at the lhc. In 9th ICATPP Conference on Astroparticle, Particle, Space Physics, Detectors and Medical Physics Applications. 2005.
Article:
@inproceedings{icatpp-2005,
    author = "Collaboration, The ATLAS",
    title = "Implementation and performance of a tau lepton selection within the ATLAS trigger system at the LHC",
    booktitle = "9th ICATPP Conference on Astroparticle, Particle, Space Physics, Detectors and Medical Physics Applications",
    year = "2005",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/icatpp-2005.pdf",
    abstract = "The ATLAS experiment at the Large Hadron Collider (LHC) has an interaction rate of up to 1 GHz. The trigger must efficiently select interesting events while rejecting the large amount of background. The First Level trigger will reduce this rate to around O(75 kHz ). Subsequently, the High Level Trigger (HLT), comprising the Second Level trigger and the Event Filter, will reduce this rate by a factor of O(1000). Triggering on taus is important for Higgs and SUSY searches at the LHC. In this paper tau trigger selections are presented based on a lepton trigger if the tau decays leptonically or via a dedicated tau hadron trigger if the tau disintegrates semileptonically. We present signal efficiency with the electron trigger using the data sample A=tau+tau=e+hadron, and rate studies obtained from the dijet sample."
}
The ATLAS Collaboration. Muon reconstruction and identification for the event filter of the atlas experiment. In 9th ICATPP Conference on High Energy Physics. 2005.
Article:
@inproceedings{icatapp-2005-2,
    author = "Collaboration, The ATLAS",
    title = "Muon Reconstruction and Identification for the Event Filter of the ATLAS experiment",
    booktitle = "9th ICATAPP Conference on High Energy Physics",
    year = "2005",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/icatpp-2005-2.pdf",
    abstract = "The ATLAS Trigger requires high efficiency and selectivity in order to keep the full physics potential of the experiment and to reject uninteresting processes from the 40 MHz event production rate of the LHC. These goals are achieved with a trigger composed of three sequential levels of increasing accuracy that have to reduce the output event rate down to ~100 Hz. This work focuses on muon reconstruction and identification for the third level (Event Filter), for which specific algorithms from the off-line environment have been adapted to work in the trigger framework. Two different strategies for accessing data (wrapped and seeded modes) are described and their reconstruction potential is then shown in terms of efficiencies, resolutions and fake muon rejection power."
}
A. Anjos, R. C. Torres, B.C. Ferreira, T.C. Xavier, J.M. Seixas, and D.O. Damazio. Otimização do sistema de trigger do segundo nível do atlas baseado em calorimetria. In XXVI Encontro Nacional de Física de Partículas e Campos. 2005.
Article:
@inproceedings{enfpc-2005-2,
    author = "Anjos, A. and Torres, R. C. and Ferreira, B.C. and Xavier, T.C. and Seixas, J.M and Damazio, D.O.",
    title = "Otimização do Sistema de Trigger do Segundo Nível do ATLAS Baseado em Calorimetria",
    booktitle = "XXVI Encontro Nacional de Física de Partículas e Campos",
    year = "2005",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/enfpc-2005-2.pdf",
    abstract = "Este trabalho apresenta um discriminador neural que opera sobre as quantidades calculadas pelo algoritmo T2Calo, responsável pela deteção elétron/jato no Segundo Nível de Filtragem do experimento ATLAS. Este sistema de deteção melhora a eficiência de deteção em quase 10 pontos percentuais, mantendo um nível de desempenho compatível com as restrições operacionais do sistema de filtragem."
}
A. Anjos, R. C. Torres, B. C. Ferreira, T. C. Xavier, and J. M. de Seixas. Discriminação neural de elétrons no segundo nível de trigger do atlas. In XXVI Encontro Nacional de Física de Partículas e Campos. 2005.
Article:
@inproceedings{enfpc-2005,
    author = "Anjos, A. and Torres, R. C. and Ferreira, B. C. and Xavier, T. C. and  and de Seixas, J. M.",
    title = "Discriminação Neural de Elétrons no Segundo Nível de Trigger do ATLAS",
    booktitle = "XXVI Encontro Nacional de Física de Partículas e Campos",
    year = "2005",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/enfpc-2005.pdf",
    abstract = "Este trabalho apresenta um discriminador neural para o segundo nível de filtragem do ATLAS, atuando no problema de separação elétron/jato baseado em informações de calorimetria. Para reduzir a alta dimensionalidade dos dados de entrada, as regiões de interesse (RoI) identificadas no primeiro nível são organizadas em anéis concêntricos de deposição energética. Este tipo de pré-processamento dos dados permite eficiente compactação dos sinais e alcança elevada capacidade de identificar elétrons. Atualmente, esse sistema vem sendo portado para o ambiente de emulação do sistema de filtragem ATHENA, de modo a se obter uma avaliação realística de seu desempenho. O ambiente tem o objetivo de simular o comportamento do sistema de filtragem, ajudando, desta forma, no desenvolvimento e validação dos algoritmos. Em caráter comparativo, o sistema proposto foi também implementado usando a tecnologia DSP."
}

2004

The ATLAS Collaboration. Implementation and performance of the high level trigger electron and photon selection for the atlas experiment at the lhc. In IEEE Nuclear Science Symposium and Medical Imaging Conference. 2004.
Article:
@inproceedings{nss-2004-3,
    author = "Collaboration, The ATLAS",
    title = "Implementation and Performance of the High Level Trigger Electron and Photon Selection for the ATLAS Experiment at the LHC",
    booktitle = "IEEE Nuclear Science Symposium and Medical Imaging Conference",
    year = "2004",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/nss-2004-3.pdf",
    abstract = "The ATLAS experiment at the Large Hadron Collider (LHC) will face the challenge of efficiently selecting interesting candidate events in pp collisions at 14 TeV center of mass energy, while rejecting the enormous number of background events, stemming from an interaction rate of up to 10^9 Hz. The First Level trigger will reduce this rate to around O(100 kHz). Subsequently, the High Level Trigger (HLT), which is comprised of the Second Level trigger and the Event Filter, will need to further reduce this rate by a factor of O(10^3). The HLT selection is software based and will be implemented on commercial CPUs, using a common framework built on the standard ATLAS object oriented software architecture. In this paper an overview of the current implementation of the selection for electrons and photons in the HLT is given. The performance of this implementation has been evaluated using Monte Carlo simulations in terms of the efficiency for the signal channels, rate expected for the selection, data preparation times, and algorithm execution times. Besides the efficiency and rate estimates, some physics examples will be discussed, showing that the triggers are well adapted for the physics programme envisaged at LHC. The electron and photon trigger software is also being exercised at the ATLAS 2004 Combined Test Beam, where components from all ATLAS subdetectors are taking data together along the H8 SPS extraction line; from these tests a validation of the selection architecture chosen in a real on-line environment is expected."
}
The ATLAS Collaboration. Design, deployment and functional tests of the on-line event filter for the atlas experiment at lhc. In Nuclear Science Symposium and Medical Imaging Conference. 2004.
Article:
@inproceedings{nss-2004-2,
    author = "Collaboration, The ATLAS",
    title = "Design, deployment and functional tests of the on-line Event Filter for the ATLAS experiment at LHC",
    booktitle = "Nuclear Science Symposium and Medical Imaging Conference",
    year = "2004",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/nss-2004-2.pdf",
    abstract = "The Event Filter selection stage is a fundamental component of the ATLAS Trigger and Data Acquisition architecture. Its primary function is the reduction of data flow and rate to values acceptable by the mass storage operations and by the subsequent off-line data reconstruction and analysis steps. The computing instrument of the EF is generally organized as a set of independent sub-farms, each connected to one output of the Event Builder switch fabric. Each sub-farm comprises a number of processors analyzing several complete events in parallel. This paper describes the design of the ATLAS EF system, its deployment in the 2004 ATLAS combined test beam together with some examples of integrating selection and monitoring algorithms. Since the processing algorithms are not specially designed for EF but are inherited as much as possible from the off-line ones, special emphasis is reserved to system reliability and data security, in particular for the case of failures in the processing algorithms. Another key design element has been system modularity and scalability. The EF shall be able to follow technology evolution and should allow for using additional processing resources possibly remotely located."
}
The ATLAS Collaboration. Online muon reconstruction in the atlas level-2 trigger system. In Nuclear Science Symposium and Medical Imaging Conference. 2004.
Article:
@inproceedings{nss-2004,
    author = "Collaboration, The ATLAS",
    title = "Online Muon Reconstruction in the ATLAS Level-2 trigger system",
    booktitle = "Nuclear Science Symposium and Medical Imaging Conference",
    year = "2004",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/nss-2004.pdf",
    abstract = "To cope with the 40 MHz event production rate of LHC, the trigger of the ATLAS experiment selects the events in three sequential steps of increasing complexity and accuracy whose final results are close to the offline reconstruction. The Level-1, implemented with custom hardware, identifies physics objects within Regions of Interests and operates a first reduction of the event rate to 75 KHz. The higher trigger levels provide a software based event selection which further reduces the event rate to about 100 Hz. This paper presents the algorithm (muFast) employed at Level-2 to confirm the muon candidates flagged by the Level-1. muFast identifies hits of muon tracks inside the Muon Spectrometer and provides a precise measurement of the muon momentum at the production vertex. The algorithm must process the Level-1 muon output rate (~20 KHz), thus a particular care has been used for its optimization. The result is a very fast track reconstruction algorithm with good physics performances which, in some cases, approach those of the offline reconstruction: it computes the pT of prompt muons with a resolution of 5.5\\% at 6 GeV and 4.0\\% at 20 GeV and with an efficiency of about 95\\%. The algorithm requires an overall execution time of ~1 ms on a 100 SpecInts95 machine."
}
The ATLAS Collaboration. Architecture of the atlas high level trigger event selection software. Nucl. Instrum. Methods Phys. Res., 518(1–2):537–541, February 2004.
Article:
@article{nima-2004,
    author = "Collaboration, The ATLAS",
    title = "Architecture of the ATLAS high level trigger event selection software",
    journal = "Nucl. Instrum. Methods Phys. Res.",
    year = "2004",
    month = "February",
    volume = "518",
    number = "1--2",
    pages = "537--541",
    pdf = "https://www.idiap.ch/~aanjos/papers/nima-2004.pdf",
    abstract = "The ATLAS High Level Trigger (HLT) consists of two selection steps: the second level trigger and the event filter. Both will be implemented in software, running on mostly commodity hardware. Both levels have a coherent approach to event selection, so a common core software framework has been designed to maximize this coherency, while allowing sufficient flexibility to meet the different interfaces and requirements of the two different levels. The approach is extended further to allow the software to run in an off-line simulation and reconstruction environment for the purposes of development. This paper describes the architecture and high level design of the software."
}
J.T. Baines, C.P. Bee, A. Bogaerts, M. Bosman, D. Botterill, B. Caron, A. Anjos, F. Etienne, S. González, K. Karr, W. Li, C. Meessen, G. Merino, A. Negri, J. L. Pinfold, P. Pinto, Z. Qian, F. Touchard, P. Werner, S. Wheeler, F.J. Wickens, W. Wiedenmann, and G. Zobernig. An overview of the atlas high-level trigger dataflow and supervision. IEEE Transactions on Nuclear Science, 51(3):361–366, June 2004.
Article:
@article{ieee-tns-2004-7,
    author = "Baines, J.T. and Bee, C.P. and Bogaerts, A. and Bosman, M. and Botterill, D. and Caron, B. and Anjos, A. and Etienne, F. and González, S. and Karr, K. and Li, W. and Meessen, C. and Merino, G. and Negri, A. and Pinfold, J. L. and Pinto, P. and Qian, Z. and Touchard, F. and Werner, P. and Wheeler, S. and Wickens, F.J. and Wiedenmann, W. and  and Zobernig, G.",
    title = "An Overview of the ATLAS High-Level Trigger Dataflow and Supervision",
    journal = "IEEE Transaction on Nuclear Science",
    year = "2004",
    month = "June",
    volume = "51",
    number = "3",
    pages = "361--366",
    pdf = "https://www.idiap.ch/~aanjos/papers/ieee-tns-2004-7.pdf",
    abstract = ""
}
André Anjos on behalf of the ATLAS Collaboration. The second level trigger of the atlas experiment at cern's lhc. IEEE Transactions on Nuclear Science, 51(3):909–914, July 2004.
Article:
@article{ieee-tns-2004-6,
    author = "on behalf of the ATLAS Collaboration, André Anjos",
    title = "The Second Level Trigger of the ATLAS Experiment at CERN\'s LHC",
    journal = "IEEE Transaction on Nuclear Science",
    year = "2004",
    month = "July",
    volume = "51",
    number = "3",
    pages = "909--914",
    pdf = "https://www.idiap.ch/~aanjos/papers/ieee-tns-2004-6.pdf",
    abstract = "The Trigger System of the ATLAS experiment reduces the rate of events produced by proton-proton collisions at CERN\'s Large Hadron Collider (LHC) in three successive steps from 40 MHz to \textasciitilde\ 100, 1 and 0.2 kHz respectively. The ATLAS Second Level Trigger is original in several ways. It makes use of information provided by the First Level Trigger which identifies Regions of Interest (RoI) indicating where the most significant activity has occurred within the detector. Accessing detector data in RoIs only reduces the estimated 100 Gbytes/s data rate by a factor 100. Appart from a custom interface to acquire the RoI information, the Second Level Trigger is implemented in software. Another cost saving approach is the development of Trigger Selection software in an offline environment using a common framework for the High Level Trigger and Reconstruction Software. Consequently, the Second Level Trigger draws on software developed in two largely independant domains: real time oriented dataflow software combined with offline selection software. In this paper we report on experience gained and results obtained with second generation prototype software of both domains. Test of the performance of the data collection have been carried out on testbeds consisting of PCs running Linux and interconnected by Gbit Ethernet switches. The selection software has been tested using simulated detector data preloaded in detector readout buffers."
}
The ATLAS Collaboration. Algorithms for the atlas high-level trigger. IEEE Transactions on Nuclear Science, 51(3):367–374, June 2004.
Article:
@article{ieee-tns-2004-5,
    author = "Collaboration, The ATLAS",
    title = "Algorithms for the ATLAS high-level trigger",
    journal = "IEEE Transactions on Nuclear Science",
    year = "2004",
    month = "June",
    volume = "51",
    number = "3",
    pages = "367--374",
    pdf = "https://www.idiap.ch/~aanjos/papers/ieee-tns-2004-5.pdf",
    abstract = "Following rigorous software design and analysis methods, an object-based architecture has been developed to derive the second- and third-level trigger decisions for the future ATLAS detector at the LHC. The functional components within this system responsible for generating elements of the trigger decisions are algorithms running within the software architecture. Relevant aspects of the architecture are reviewed along with concrete examples of specific algorithms and their performance in \'vertical\' slices of various physics selection strategies."
}
The ATLAS Collaboration. The base-line dataflow system of the atlas trigger and daq. IEEE Transactions on Nuclear Science, 51(3):470–475, June 2004.
Article:
@article{ieee-tns-2004-4,
    author = "Collaboration, The ATLAS",
    title = "The base-line DataFlow system of the ATLAS Trigger and DAQ",
    journal = "IEEE Transactions on Nuclear Science",
    year = "2004",
    month = "June",
    volume = "51",
    number = "3",
    pages = "470--475",
    pdf = "https://www.idiap.ch/~aanjos/papers/ieee-tns-2004-4.pdf",
    abstract = "The base-line design and implementation of the ATLAS DAQ DataFlow system is described. The main components realizing the DataFlow system, their interactions, bandwidths and rates are being discussed and performance measurements on a 10\\% scale prototype for the final Atlas TDAQ DataFlow system are presented. This prototype is a combination of custom design components and of multi-threaded software applications implemented in C++ and running in a Linux environment on commercially available PCs interconnected by a fully switched gigabit Ethernet network."
}
The ATLAS Collaboration. Atlas tdaq data collection software. IEEE Transactions on Nuclear Science, 51:585–590, June 2004.
Article:
@article{ieee-tns-2004-3,
    author = "Collaboration, The ATLAS",
    title = "ATLAS TDAQ data collection software",
    journal = "IEEE Transactions on Nuclear Science",
    year = "2004",
    month = "June",
    volume = "51",
    OPTnumber = "",
    pages = "585--590",
    pdf = "https://www.idiap.ch/~aanjos/papers/ieee-tns-2004-3.pdf",
    abstract = "The DataCollection (DC) is a subsystem of the ATLAS Trigger and DAQ system. It is responsible for the movement of event data from the ReadOut subsystem to the Second Level Trigger and to the Event Filter. This functionality is distributed on several software applications running on Linux PCs interconnected with Gigabit Ethernet. For the design and implementation of these applications a common approach has been adopted. This approach leads to the design and implementation of a common DC software framework providing a suite of common services."
}
The ATLAS Collaboration. Studies for a common selection software environment in atlas: from the level-2 trigger to the offline reconstruction. IEEE Transactions on Nuclear Science, 51(3):915–920, June 2004.
Article:
@article{ieee-tns-2004-2,
    author = "Collaboration, The ATLAS",
    title = "Studies for a common selection software environment in ATLAS : from the Level-2 Trigger to the offline reconstruction",
    journal = "IEEE Transactions on Nuclear Science",
    year = "2004",
    month = "June",
    volume = "51",
    number = "3",
    pages = "915--920",
    pdf = "https://www.idiap.ch/~aanjos/papers/ieee-tns-2004-2.pdf",
    abstract = "The ATLAS High Level Trigger\'s primary function of event selection will be accomplished with a Level-2 trigger farm and an Event Filter farm, both running software components developed in the Atlas offline reconstruction framework. While this approach provides a unified software framework for event selection, it poses strict requirements on offline components critical for the Level-2 trigger. A Level-2 decision in Atlas must typically be accomplished within 10 ms and with multiple event processing in concurrent threads. In order to address these constraints, prototypes have been developed that incorporate elements of the Atlas Data Flow -, High Level Trigger -, and offline framework software. To realize a homogeneous software environment for offline components in the High Level Trigger, the Level-2 Steering Controller was developed. With electron/gamma- and muon-selection slices it has been shown that the required performance can be reached, if the offline components used are carefully designed and optimized for the application in the High Level Trigger."
}
A. Anjos and J.M. Seixas. Os filtros de alto nível do experimento atlas. In XXVI Encontro Nacional de Física de Partículas e Campos. 2004.
Article:
@inproceedings{enfpc-2004,
    author = "Anjos, A. and Seixas, J.M.",
    title = "Os Filtros de Alto Nível do Experimento ATLAS",
    booktitle = "XXVI Encontro Nacional de Física de Partículas e Campos",
    year = "2004",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/enfpc-2004.pdf",
    abstract = "O Experimento ATLAS conta com um sistema de Filtragem bastante complexo e dividido em 4 grandes sub-sistemas: (i) O Primeiro Nível de Filtragem que realiza os primeiros passos da seleção de eventos na cadeia de filtragem; (ii) O Software Online, responsável pelas áreas de controle e operação do sistema;(iii) O Sistema de Fluxo de Dados (Dataflow) que coordena a transmissão e armazenamento dos dados do detetor do experimento; (iv) Os Filtros de Alto Nível, que implementam os algoritmos de discriminação, representando o topo da cadeia de seleção de eventos no ATLAS. Para aumentar a portabilidade entre os algoritmos de filtragem desenvolvidos por toda a comunidade do experimento, os desenvolvedores dos Filtros de Alto Nível (ou simplesmente HLT; High-Level Triggers) propuseram a reutilização do ambiente de programação online Athena dentro do sistema que operará em tempo real. Para tal, o HLT utiliza as ferramentas propostas pelo sub-sistema de Fluxo de Dados para coordenar as operações da transferência de informação para dentro e para fora dos nós de processamento sistema. Dentre outras restrições, o produto final deverá ser suficientemente rápido e operável em tarefas concorrentes em máquinas (SMP) com vários processadores rodando Linux. Neste trabalho apresentamos alguns dos problemas e soluções encontrados pelo grupo no desenvolvimento e teste do conjunto de bibliotecas que compõe o HLT."
}
The ATLAS Collaboration. Performance of the atlas daq dataflow system. In Computing in High Energy Physics and Nuclear Physics. 2004.
Article:
@inproceedings{chep-2004-2,
    author = "Collaboration, The ATLAS",
    title = "Performance of the ATLAS DAQ DataFlow system",
    booktitle = "Computing in High Energy Physics and Nuclear Physics",
    year = "2004",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/chep-2004-2.pdf",
    abstract = "The baseline DAQ architecture of the ATLAS Experiment at LHC is introduced and its present implementation and the performance of the DAQ components as measured in a laboratory environment are summarized. It will be shown that the discrete event simulation model of the DAQ system, tuned using these measurements, does predict the behaviour of the prototype configurations well, after which, predictions for the final ATLAS system are presented. With the currently available hardware and software, a system using ~140 ROSs with 3GHz single cpu, ~100 SFIs with dual 2.4 GHz cpu and ~500 L2PUs with dual 3.06 GHz cpu."
}
The ATLAS Collaboration. Portable gathering system for monitoring and online calibration at atlas. In Computing in High Energy Physics and Nuclear Physics 2004. 2004.
Article:
@inproceedings{chep-2004,
    author = "Collaboration, The ATLAS",
    title = "Portable Gathering System for Monitoring and Online Calibration at ATLAS",
    booktitle = "Computing in High Energy Physics and Nuclear Physics 2004",
    year = "2004",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/chep-2004.pdf",
    abstract = "During the runtime of any experiment, a central monitoring system that detects problems as soon as they appear has an essential role. In a large experiment, like ATLAS, the online data acquisition system is distributed across the nodes of large farms, each of them running several processes that analyse a fraction of the events. In this architecture, it is necessary to have a central process that collects all the monitoring data from the different nodes, produces full statistics histograms and analyses them. In this paper we present the design of such a system, called the gatherer. It allows to collect any monitoring object, such as histograms, from the farm nodes, from any process in the DAQ, trigger and reconstruction chain. It also adds up the statistics, if required, and processes user defined algorithms in order to analyse the monitoring data. The results are sent to a centralized display, that shows the information online, and to the archiving system, triggering alarms in case of problems. The innovation of this system is that conceptually it abstracts several underlying communication protocols, being able to talk with different processes using different protocols at the same time and, therefore, providing maximum flexibility. The software is easily adaptable to any trigger-DAQ system. The first prototype of the gathering system has been implemented for ATLAS and has been running during this year\'s combined test beam. An evaluation of this first prototype will also be presented."
}

2003

A. Anjos and J.M. Seixas. Neural particle discrimination for triggering interesting physics channels with calorimetry data. Nuclear Instruments And Methods In Physics Research A - Accelerators, Spectrometers, Detectors And Associated Equipment, 502:713–715, August 2003.
Article:
@article{nima-2003,
    author = "Anjos, A. and Seixas, J.M.",
    title = "Neural particle discrimination for triggering interesting physics channels with calorimetry data",
    journal = "Nuclear Instruments And Methods In Physics Research A - Accelerators, Spectrometers, Detectors And Associated Equipament",
    year = "2003",
    month = "August",
    volume = "502",
    OPTnumber = "",
    pages = "713--715",
    pdf = "https://www.idiap.ch/~aanjos/papers/nima-2003.pdf",
    abstract = "This article introduces a triggering scheme for high input rate processors, based on neural networks. The technique is applied to the Electron/Jet discrimination problem, present at the second level trigger of the ATLAS experiment, being constructed at CERN. The proposed solution outperforms the scheme adopted nowadays at CERN, both in discrimination efficiency and performance, becoming a candidate algorithm for implementation at the experiment."
}
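The discriminator in the entry above relies on compacting calorimeter cells into sums over concentric rings before the neural stage. The snippet below is a rough sketch of such ring-sum preprocessing, under the simplifying assumptions that the cells form a regular 2D grid and that rings are centred on the most energetic cell; it is not the code used in the paper.

```python
# Illustrative sketch of ring-sum preprocessing: calorimeter cell energies
# are compacted into sums over concentric rings around the hottest cell.
import numpy as np


def ring_sums(cell_energies, n_rings=8):
    """Sum a 2D grid of cell energies into concentric square rings
    centred on the most energetic cell (a simplifying assumption)."""
    grid = np.asarray(cell_energies, dtype=float)
    peak = np.unravel_index(np.argmax(grid), grid.shape)
    rows, cols = np.indices(grid.shape)
    # Chebyshev distance to the peak defines the ring index of each cell.
    ring_index = np.maximum(np.abs(rows - peak[0]), np.abs(cols - peak[1]))
    return np.array([grid[ring_index == r].sum() for r in range(n_rings)])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_roi = rng.exponential(scale=0.1, size=(16, 16))
    fake_roi[8, 8] = 5.0  # pretend electron core
    features = ring_sums(fake_roi)
    print(features)  # 8 numbers instead of 256 cells
```
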
The ATLAS Collaboration. An overview of algorithms for the atlas high level trigger. IEEE Transactions on Nuclear Science, 51(3 (2004)):367–374, June 2003.
Article:
@article{ieee-tns-2004,
    author = "Collaboration, The ATLAS",
    title = "An Overview of Algorithms for the ATLAS High Level Trigger",
    journal = "IEEE Transactions on Nuclear Science",
    year = "2003",
    month = "June",
    volume = "51",
    number = "3 (2004)",
    pages = "367--374",
    pdf = "https://www.idiap.ch/~aanjos/papers/ieee-tns-2004.pdf",
    abstract = "Following rigorous software design and analysis methods, an object-based architecture has been developed to derive the second- and third-level trigger decisions for the future ATLAS detector at the LHC. The functional components within this system responsible for generating elements of the trigger decisions are algorithms running within the software architecture. Relevant aspects of the architecture are reviewed along with concrete examples of specific algorithms."
}
The ATLAS Collaboration. The baseline dataflow system of the atlas trigger and daq. In 9th Workshop on Electronics for LHC Experiments. 2003.
Article:
@inproceedings{elhc-2003,
    author = "Collaboration, The ATLAS",
    title = "The baseline dataflow system of the ATLAS trigger and DAQ",
    booktitle = "9th Workshop on Electronics for LHC Experiments",
    year = "2003",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/elhc-2003.pdf",
    abstract = "In this paper the baseline design of the ATLAS High Level Trigger and Data Acquisition system with respect to the DataFlow aspects, as presented in the recently submitted ATLAS Trigger/DAQ/Controls Technical Design Report [1], is reviewed and recent results of testbed measurements and from modelling are discussed."
}
The ATLAS Collaboration. The dataflow system of the atlas trigger and daq. In Conference for Computing in High-Energy and Nuclear Physics. 2003.
Article:
@inproceedings{chep-2003-4,
    author = "Collaboration, The ATLAS",
    title = "The DataFlow System of the ATLAS Trigger and DAQ",
    booktitle = "Conference for Computing in High-Energy and Nuclear Physics",
    year = "2003",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/chep-2003-4.pdf",
    abstract = "The baseline design and implementation of the DataFlow system, to be documented in the ATLAS DAQ/HLT Technical Design Report in summer 2003, will be presented. Empahsis will be placed on the system performance and scalability based on the results from prototyping studies which have maximised the use of commercially available hardware."
}
The ATLAS Collaboration. A new implementation of the region-of-interest strategy for the atlas second level trigger. In Conference for Computing in High-Energy and Nuclear Physics. 2003.
Article:
@inproceedings{chep-2003-3,
    author = "Collaboration, The ATLAS",
    title = "A New Implementation of the Region-of-Interest Strategy for the ATLAS Second Level Trigger",
    booktitle = "Conference for Computing in High-Energy and Nuclear Physics",
    year = "2003",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/chep-2003-3.pdf",
    abstract = "Among the many challenges presented by the future ATLAS detector at the LHC are the high data taking rate and volume and the derivation of a rapid trigger decision with limited resources. To address this challenge within the ATLAS second level trigger system, a Region-of-Interest mechanism has been adopted which dramatically reduces the relevant fiducial volume necessary to be readout and processed to small regions guided by the hardware-based first level trigger. Software ha s been developed to allow fast translation between arbitrary geometric regions and identifiers of small collections of the event data. This facilitates on-demand data retrieval and collection building. The system is optimized to minimize the amount of data transferred and unnecessary building of complex objects. Detail s of the design and implementation are presented along with preliminary performance results."
}
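The Region-of-Interest mechanism described above translates a geometric region into the identifiers of the few data collections that must be read out. A toy sketch of that translation follows; the bin sizes and identifiers are invented, and the real ATLAS geometry and identifier scheme are far more involved.

```python
# Toy Region-of-Interest lookup: map an (eta, phi) window onto identifiers
# of the small data collections that need to be fetched. Purely illustrative.
import math

ETA_BIN = 0.1   # assumed collection granularity in eta
PHI_BIN = 0.1   # assumed collection granularity in phi


def region_to_identifiers(eta_min, eta_max, phi_min, phi_max):
    """Return the set of (eta_bin, phi_bin) collection identifiers that
    overlap the requested rectangular region."""
    identifiers = set()
    e = math.floor(eta_min / ETA_BIN)
    while e * ETA_BIN < eta_max:
        p = math.floor(phi_min / PHI_BIN)
        while p * PHI_BIN < phi_max:
            identifiers.add((e, p))
            p += 1
        e += 1
    return identifiers


if __name__ == "__main__":
    # Only the few collections under the RoI are requested, not the full event.
    print(sorted(region_to_identifiers(-0.25, 0.05, 1.0, 1.25)))
```
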
The ATLAS Collaboration. The algorithm steering and trigger decision mechanism of the atlas high level trigger. In Conference for Computing in High-Energy and Nuclear Physics. 2003.
Article:
@inproceedings{chep-2003-2,
    author = "Collaboration, The ATLAS",
    title = "The Algorithm Steering and Trigger Decision mechanism of the ATLAS High Level Trigger",
    booktitle = "Conference for Computing in High-Energy and Nuclear Physics",
    year = "2003",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/chep-2003-2.pdf",
    abstract = "Given the extremely high output rate foreseen at LHC and the general-purpose nature of ATLAS experiment, an efficient and flexible way to select events in the High Level Trigger is needed. An extremely flexible solution is proposed that allows for early rejection of unwanted events and an easily configurable way to choose algorithms and to specify the criteria for trigger decisions. It is implemented in the standard ATLAS object-oriented software framework, Athena. The early rejection is achieved by breaking the decision process down into sequential steps. The configuration of each step defines sequences of algorithms which should be used to process the data, and \'trigger menus\' that define which physics signatures must be satisfied to continue on to the next step, and ultimately to accept the event. A navigation system has been built on top of the standard Athena transient store (StoreGate) to link the event data together in a tree-like structure. This is fundamental to the seeding mechanism, by which data from one step is presented to the next. The design makes it straightforward to utilize existing off-line reconstruction data classes and algorithms when they are suitable"
}
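The steering scheme summarised above breaks the trigger decision into sequential steps, seeds each step with the data produced by the previous one, and rejects the event as soon as no signature can still be satisfied. Below is a compact sketch of that control flow; the step, algorithm and signature names are made up for illustration and are not the Athena steering API.

```python
# Sketch of stepwise steering with early rejection and seeding.
def run_steering(event, steps):
    """steps: list of (algorithms, signatures). Each algorithm maps the event
    to named features; each signature inspects the accumulated features.
    The event is accepted only if every step keeps at least one signature alive."""
    features = {}  # data produced at one step seeds the next one
    for algorithms, signatures in steps:
        for algorithm in algorithms:
            features.update(algorithm(event))
        # Early rejection: stop as soon as no signature survives this step.
        if not any(signature(features) for signature in signatures):
            return False
    return True


if __name__ == "__main__":
    calo_cluster = lambda ev: {"et": ev["et"]}
    track_match = lambda ev: {"has_track": ev["has_track"]}
    e25 = lambda f: f.get("et", 0.0) > 25.0
    e25_with_track = lambda f: e25(f) and f.get("has_track", False)
    steps = [([calo_cluster], [e25]), ([track_match], [e25_with_track])]
    print(run_steering({"et": 30.0, "has_track": True}, steps))  # True: accepted
    print(run_steering({"et": 10.0, "has_track": True}, steps))  # False: rejected at step 1
```
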
The ATLAS Collaboration. Experience with multi-threaded c++ applications in the atlas dataflow software. In Conference for Computing in High-Energy and Nuclear Physics. 2003.
Article:
@inproceedings{chep-2003-1,
    author = "Collaboration, The ATLAS",
    title = "Experience with multi-threaded C++ applications in the ATLAS dataflow software",
    booktitle = "Conference for Computing in High-Energy and Nuclear Physics",
    year = "2003",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/chep-2003-1.pdf",
    abstract = "The DataFlow is sub-system of the ATLAS data acquisition responsible for the reception, buffering and subsequent movement of partial and full event data to the higher level triggers: Level 2 and Event Filter. The design of the software is based on OO methodology and its implementation relies heavily on the use of posix threads and the Standard Template Library. This article presents our experience with Linux, posix threads and the Standard Template Library in the real time environment of the ATLAS data flow."
}
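The entry above concerns POSIX threads and the STL in the C++ DataFlow software. As a language-agnostic illustration of the underlying pattern, the sketch below uses Python threads and a queue to mimic read-out threads pushing event fragments to a builder; it is purely illustrative and unrelated to the actual ATLAS code.

```python
# Producer threads push event fragments into a queue; a builder thread
# assembles them into complete events. Illustrative sketch only.
import queue
import threading

fragments = queue.Queue()


def readout(source_id, n_events):
    """Pretend read-out thread: emit one fragment per event."""
    for event_id in range(n_events):
        fragments.put((event_id, source_id, b"payload"))


def builder(n_sources, n_events):
    """Collect fragments until each event has one fragment per source."""
    pending = {}
    built = 0
    while built < n_events:
        event_id, source_id, payload = fragments.get()
        pending.setdefault(event_id, {})[source_id] = payload
        if len(pending[event_id]) == n_sources:
            built += 1
            print(f"event {event_id} built from {n_sources} sources")


if __name__ == "__main__":
    sources, events = 3, 2
    threads = [threading.Thread(target=readout, args=(s, events)) for s in range(sources)]
    threads.append(threading.Thread(target=builder, args=(sources, events)))
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```
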
The ATLAS Collaboration. The atlas hlt, daq and dcs technical design report. Technical Report, CERN Publication, 2003.
Article:
@techreport{cern-tdaq-tdr-2003,
    author = "Collaboration, The ATLAS",
    title = "The ATLAS HLT, DAQ and DCS Technical Design Report",
    institution = "CERN Publication",
    year = "2003",
    OPTvolume = "",
    OPTnumber = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/cern-tdaq-tdr-2003.pdf",
    abstract = "This document contains the \'blue-print\' specifications of the Trigger/DAQ systems of ATLAS."
}
The ATLAS Collaboration. Architecture of the atlas online physics-selection software at lhc. In Conference on Astroparticle, Particle, Space Physics, Detectors and Medical Physics Applications. 2003.
Article:
@inproceedings{astro-2003,
    author = "Collaboration, The ATLAS",
    title = "Architecture of the ATLAS online physics-selection software at LHC",
    booktitle = "Conference on Astroparticle, Particle, Space Physics, Detectors and Medical Physics Applications",
    year = "2003",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/astro-2003.pdf",
    abstract = "A filtragem de eventos no experimento ATLAS é organizada em dois níveis distintos: o Segundo Nível de Filtragem e e Filtro de Eventos. Um enfoque unificado para selecionar eventos em ambos os níveis foi escolhido. Desta forma, um conjunto de rotinas de base foi projetada para maximizar o compartilhamento das interfaces e componentes offline, ainda que mantendo uma flexibilidade suficientemente grande para atender aos requisitos operacionais do Sistema de Filtragem, notavelmente aqueles relacionados ao desempenho e robustez. Este artigo descreve a arquitetura e o projeto do sistema de seleção de eventos e mostra como esta implementação é compatível com os desafios do experimento."
}

2001

André Anjos. Sistema neuronal rápido de decisão baseado em calorimetria de altas energias. MSc thesis, COPPE/UFRJ, 2001.
Article:
@mastersthesis{msc-thesis-2001,
    author = "Anjos, André",
    title = "Sistema neuronal rápido de decisão baseado em calorimetria de altas energias",
    school = "COPPE/UFRJ",
    year = "2001",
    pdf = "https://www.idiap.ch/~aanjos/papers/msc-thesis-2001.pdf",
    abstract = "This work (written in portuguese) develops a fast neural classifier for high energy particle discrimination (electron/jet) at the second level trigger of ATLAS, at CERN, Switzerland. The classifier is fed by the information from one of the ATLAS detectors, the calorimeter, a highly segmented detector which measures the energy of particles with high resolution. The information is preprocessed in a clever way, by building concentric energy ring sums, which reduces significantly the input dimensionality. Despite the high information compactation rate, the designed system achieves a very high discrimination efficiency (97\\% for electrons and 95,1\\% for jets), outperforming the classical solution implemented nowadays at the second level trigger. A system implementation on a fast digital signal processor (DSP) is presented, and its performance is evaluated in both speed and accuracy."
}
A. Anjos and J.M. Seixas. Redes neurais especialistas para a separação elétron-jato usando calorímetros multi-camadas e multi-segmentados. In XXII Encontro Nacional de Física de Partículas e Campos. 2001.
Article:
@inproceedings{enfpc-2001,
    author = "Anjos, A. and Seixas, J.M.",
    title = "Redes Neurais especialistas para a separação Elétron-Jato usando Calorímetros multi-camadas e multi-segmentados",
    booktitle = "XXII Encontro Nacional de Física de Partículas e Campos",
    year = "2001",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/enfpc-2001.pdf",
    abstract = "O experimento ATLAS estará operacional no ano de 2006. O objetivo principal deste experimento é a deteção do bóson de Higgs, usando entre outros tipos de detetores, calorímetros. Um dos canais de deteção mais importantes no experimento é o de elétrons com alta energia transversa, representando de 30 a 40\\% do total das assinaturas a serem analisadas pelo Sistema de Filtragem. Jatos (de partículas) confundem-se comumente com elétrons pela forma que interagem com os calorímetros. Neste trabalho, apresentamos um sistema de discriminação elétron-jato baseado em redes neurais especialistas, utilizando os dados dos calorímetros do ATLAS. Este sistema, depois de treinado, compacta o espaço de variáveis de entrada (células dos calorímetros) em um subespaço que mantém os aspectos necessários para uma deteção eficiente de elétrons. Os resultados apresentados se mostram melhores que os resultados obtidos usando-se técnicas desenvolvidas no CERN, com o mesmo objetivo."
}

2000

André Rabello dos Anjos and José Manoel de Seixas. Mapeamento em anéis para uma separação neuronal elétron-jato usando calorímetros multi-camadas e multi-segmentados. In XIX Encontro Nacional de Física de Partículas e Campos. 2000.
Article:
@inproceedings{enfpc-2000,
    author = "dos Anjos, André Rabello and de Seixas, José Manoel",
    title = "Mapeamento em anéis para uma separação neuronal elétron-jato usando calorímetros multi-camadas e multi-segmentados",
    booktitle = "XIX Encontro Nacional de Física de Partículas e Campos",
    year = "2000",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/enfpc-2000.pdf",
    abstract = "Propõe-se neste trabalho, que a análise conduzida no Segundo Nível de Filtragem do Experimento ATLAS, no CERN, seja feita por meio de processamento neural sobre a região de interesse previamente destacada pelo Primeiro Nível nos Calorímetros. Por depender do posicionamento da RoI no detetor, os números de camadas, granularides e profundidades das células do calorímetro são desconhecidos até o momento da chegada do evento ao sistema de análise. Ainda assim, estima-se que o número de células para análise estará em torno de 1000 por RoI. As eficiências de separação obtidas, tempos de execução e uma comparação com a eficiência de outros métodos empregados para a mesma atividade são discutidas."
}

1999

André Rabello dos Anjos and José Manoel de Seixas. Integrando plataformas e algoritmos para o segundo nível de trigger do experimento atlas. In Encontro Nacional de Física de Partículas e Campos. 1999.
Article:
@inproceedings{enfpc-1999,
    author = "dos Anjos, André Rabello and de Seixas, José Manoel",
    title = "Integrando Plataformas e Algoritmos para o Segundo Nível de Trigger do Experimento ATLAS",
    booktitle = "Encontro Nacional de Física de Partículas e Campos",
    year = "1999",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/enfpc-1999.pdf",
    abstract = "Este artigo contém um sumário do trabalho realizado no âmbito dos estudos de portabilidade da infraestrutura do fluxo de dados do Sistema de Filtragem, originalmente escritos em C e operando em Sistemas Operacionais comerciais para uma implementação orientada a objetos baseada em C++ rodando sobre Linux. Ele discute as vantagens deste enfoque, tanto em termos de mantenibilidade quanto do custo final de projeto."
}

1998

J. M. Seixas, A. R. Anjos, C. B. Prado, L. P. Calôba, A. C. H. Dantas, and J. C. R. Aguiar. Neural classifiers implemented in a transputer based parallel machine. In International Meeting on Vector and Parallel Processing (VECPAR). 1998.
Article:
@inproceedings{vecpar-1998,
    author = "Seixas, J. M. and Anjos, A. R. and Prado, C. B. and Calôba, L. P. and Dantas, A. C. H. and Aguiar, J. C. R.",
    title = "Neural classifiers implemented in a transputer based parallel machine",
    booktitle = "International Meeting on Vector and Parallel Processing (VECPAR)",
    year = "1998",
    OPTvolume = "",
    OPTnumber = "",
    OPTpages = "",
    pdf = "https://www.idiap.ch/~aanjos/papers/vecpar-1998.pdf",
    abstract = "A transputer based parallel machine is used as a development platform for fast neural signal processing applications in physics and electricity. The 16 node machine houses 32-bit floating point digital signal processors running as coprocessor for the transputers, so that signal processing can be optimized. The application in physics consists in a prototype of an online validation system for a high event rate collider experiment, which is implemented using neural networks for physics process identification. In electricity, a nonintrusive load monitoring system for household appliances is developed using a neural discriminator to identify seven groups of equipment."
}
J.M. Seixas, L.P. Caloba, A.R. Anjos, B. Kastrup, A.C.H. Dantas, and R. Linhares. A neural online triggering system based on parallel processing. IEEE Transactions on Nuclear Science, 45(4):1814–1818, August 1998.
Article:
@article{ieee-tns-1998,
    author = "Seixas, J.M. and Caloba, L.P. and Anjos, A.R. and Kastrup, B. and Dantas, A.C.H. and Linhares, R.",
    title = "A neural online triggering system based on parallel processing",
    journal = "IEEE Transactions on Nuclear Science",
    year = "1998",
    month = "August",
    volume = "45",
    number = "4",
    pages = "1814--1818",
    pdf = "https://www.idiap.ch/~aanjos/papers/ieee-tns-1998.pdf",
    abstract = "The study of a prototype of the second-level triggering system for operation at LHC conditions is addressed by means of a parallel machine implementation. The 16 node transputer based machine uses a fast digital signal processor acting as a coprocessor for optimizing signal processing applications. A C-language development environment is used for running all applications at ultimate speed. The implementation is based on information supplied by four detectors and includes two phases of system operation: feature extraction and global decision. Feature extraction for calorimeters and global decision processing are performed by means of neural networks. Preprocessing and neural network parameters rest in memory and the activation function is implemented using a look up table. Simulated data for the second-level trigger operation are used for performance evaluation."
}
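One concrete detail in the abstract above is that the neuron activation function was implemented as a look-up table for speed. The snippet below sketches that idea with an arbitrary table size and range; the values are not those used in the paper.

```python
# Approximate an activation function by a precomputed look-up table.
import numpy as np

# Tabulate tanh on a fixed grid once, at "configuration" time.
TABLE_MIN, TABLE_MAX, TABLE_SIZE = -4.0, 4.0, 1024
_GRID = np.linspace(TABLE_MIN, TABLE_MAX, TABLE_SIZE)
_TANH_TABLE = np.tanh(_GRID)


def tanh_lut(x):
    """Approximate tanh(x) by nearest-entry lookup, saturating outside the grid."""
    x = np.clip(x, TABLE_MIN, TABLE_MAX)
    index = np.round((x - TABLE_MIN) / (TABLE_MAX - TABLE_MIN) * (TABLE_SIZE - 1)).astype(int)
    return _TANH_TABLE[index]


if __name__ == "__main__":
    xs = np.array([-5.0, -1.0, 0.0, 0.5, 3.0])
    print(tanh_lut(xs))   # table approximation
    print(np.tanh(xs))    # reference values
```
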
André Rabello dos Anjos, Augusto Dantas, and José Manoel de Seixas. Um protótipo do sistema de validação do nível 2 para as condições do lhc. In Encontro Nacional de Física de Partículas e Campos, 32–33. 1998.
Article:
@inproceedings{enfpc-1998,
    author = "dos Anjos, André Rabello and Dantas, Augusto and de Seixas, José Manoel",
    title = "Um Protótipo do Sistema de Validação do Nível 2 para as Condições do LHC",
    booktitle = "Encontro Nacional de Física de Partículas e Campos",
    year = "1998",
    OPTvolume = "",
    OPTnumber = "",
    pages = "32--33",
    pdf = "https://www.idiap.ch/~aanjos/papers/2009/01/09/sbf98.pdf",
    abstract = "O experimento ATLAS pretende comprovar a existência do bóson de Higgs. Para tal, um grande sistema de deteção e aquisição vem sendo projetado. O sistema aquisição tem a função de separar em tempo real interações originárias do decaimento de um Higgs de física ordinária. A filtragem de eventos no sistema de aquisição é concebida em três níveis, de complexidade crescente e velocidade decrescente. O segundo nível pretende utilizar redes de computadores pessoais (PC\'s) interconectados por rápidos sistemas de rede. A escolha de fabricantes, sistemas operacionais e algoritmos de processamento ainda não foi feita, mas esforços em prol desta decisão vêm sendo realizados. Neste trabalho desenvolve-se uma fração do filtro de segundo nível utilizando-se de processamento paralelo, redes neurais artificiais e DSP\'s."
}

1997

J.M. Seixas, L.P. Calôba, A.R. Anjos, A.C.H. Dantas, and R. Linhares. Fast neural decision system based on dsps and parallel processing. In International Conference on Signal Processing Applications and Technologies, San Diego, USA, 1629–1633. 1997.
Article:
@inproceedings{icspat-1997,
    author = "Seixas, J.M. and Calôba, L.P. and Anjos, A.R. and Dantas, A.C.H. and Linhares, R.",
    title = "Fast Neural Decision System Based On DSPs And Parallel Processing",
    booktitle = "International Conference on Signal Processing Applications and Technologies, San Diego, USA",
    year = "1997",
    OPTvolume = "",
    OPTnumber = "",
    pages = "1629--1633",
    pdf = "https://www.idiap.ch/~aanjos/papers/icspat-1997.pdf",
    abstract = "A prototype of an online event validation system is developed for application in a high-energy collider experiment. The system mainly uses neural networks for extracting rate events with physics significance from a huge background noise. It is based on processing the information collected from different detectors placed around the collision point. Combining a feature extraction phase for each detector with a global decision phase for final decision on discarding or not a given event, the system acts on events previously selected by a first-level analysis that reduces the event rate to 100 kHz. To cope with this input frequency, the proposed system is being emulated in a 16 node transputer base parallel machine that has a fast digital signal processor running as a co-procesor for each node."
}
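The system described above combines a feature-extraction phase per detector with a global decision phase. The toy sketch below illustrates that two-phase structure with invented features and weights; it is not the published implementation.

```python
# Per-detector feature extraction followed by a single global decision stage.
import numpy as np


def calorimeter_features(energies):
    """Toy feature extraction for the calorimeter: total and peak energy."""
    energies = np.asarray(energies, dtype=float)
    return np.array([energies.sum(), energies.max()])


def tracker_features(hits):
    """Toy feature extraction for a tracking detector: hit count."""
    return np.array([float(len(hits))])


def global_decision(features, weights, bias=0.0):
    """Single-neuron global decision: accept when the weighted sum is positive."""
    return float(np.dot(weights, features) + bias) > 0.0


if __name__ == "__main__":
    features = np.concatenate([
        calorimeter_features([3.2, 7.5, 1.1]),
        tracker_features([(0.1, 0.2), (0.3, 0.1)]),
    ])
    weights = np.array([0.5, 1.0, -0.3])  # arbitrary illustrative weights
    print(global_decision(features, weights, bias=-6.0))
```
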
André Rabello dos Anjos. Sistema de classificação baseado em uma máquina com sistema distribuído. Graduation thesis, Departamento de Eletronica/UFRJ, 1997.
Article:
@phdthesis{grad-thesis-1997,
    author = "dos Anjos, André Rabello",
    title = "Sistema de classificação baseado em uma máquina com sistema distribuído",
    school = "Departamento de Eletronica/UFRJ",
    year = "1997",
    pdf = "https://www.idiap.ch/~aanjos/papers/grad-thesis-1997.pdf",
    abstract = "Na busca de novos canais físicos em experimentos com partículas colididas, sistemas de validação têm se mostrado de grande valia. Normalmente os subprodutos de colisões interparticulares representam física ordinária e conhecida enquanto que nova física aparece camuflada neste meio. A impossibilidade de gravação e análise do imenso volume de dados produzido nestes ambientes exige o uso de sistemas de validação para que se maximize o espaço de gravação e se minimize o espaço de procura de novos fenômenos. Em particular, no CERN, o par acelerador/colisionador do LHC, que estará operacional no ano de 2005, utilizará um destes sistemas de validação baseado em 3 níveis em cascata de complexidade crescente e velocidade decrescente. Este sistema tem por objetivo a análise e filtragem, em tempo real, de um volume de dados cuja taxa chega à impressionante faixa de 100.000.000 por segundo. A divisão do sistema em 3 etapas distintas visa produzir um sistema de validação o mais eficiente e dinâmico possível, sem que se sobrecarregue nenhuma das partes. Para o primeiro nível estima-se a utilização de processadores velozes, com nível baixo de programação, capazes de suportar a taxa inicial dos eventos. Para o terceiro nível o uso de pesado ambiente computacional é previsto. No segundo nível ambientes altamente programáveis serão combinados com técnicas de paralelização de aplicações para que atinjamos a taxa de processamento requerida de 100.000 eventos por segundo. Vários tipos de tecnologia estão sendo testadas em todo o mundo para que se decida, não somente sobre a arquitetura, mas, também, sobre o tipo de equipamento a ser empregado neste extenso sistema de classificação. Este trabalho é sobre a implementação em uma máquina com processamento distribuído de uma das arquiteturas previstas para o segundo nível de validação (ou classificação) do experimento ATLAS/LHC. A máquina em questão é um sistema Telmat TN310 com processamento distribuído por 16 nós padrão HTRAM totalmente conectados através de uma rede de chaves assíncronas. A arquitetura mencionada prevê a utilização de técnicas de paralelismo de dados e fluxo na obtenção de menores tempos de processamento. O objetivo final é entender se o processamento em sistemas semelhantes a uma TN310 (visamos o tipo de nó-de-processamento e o padrão de conexão entre estes) pode ser viável para o segundo nível de validação. Isto se dará através da análise e capacidade de abstração proporcionadas pelo desenvolvimento da aplicação sugerida no equipamento. Soma-se ao trabalho o desenvolvimento de uma unidade de decisões globais baseado em redes neurais. A unidade constitui processo central do sistema de validação. Resultados atingidos são expostos e discussões sobre técnicas de implementação são realizadas no decorrer da documentação."
}

1996

J.M. Seixas, L.P. Calôba, and A.R. Anjos. Particle discrimination using sub-optimal filtering techniques. In Congresso Brasileiro de Automatica (CBA), São Paulo, Brasil, 635–640. 1996.
Article:
@inproceedings{cba-1996,
    author = "Seixas, J.M. and Calôba, L.P. and Anjos, A.R.",
    title = "Particle discrimination using sub-optimal filtering techniques",
    booktitle = "Congresso Brasileiro de Automatica (CBA), São Paulo, Brasil",
    year = "1996",
    OPTvolume = "",
    OPTnumber = "",
    pages = "635--640",
    pdf = "https://www.idiap.ch/~aanjos/papers/cba-1996.pdf",
    abstract = "The discrimination of high energy electrons and pions using a scintillating fiber calorimeter is addressed. The discrimination method is based on analyzing the time structure of calorimeter signals and achieves discrimination response smaller than 100 ns. Signals pass through a high performance constant fraction discriminator and events that lie in the confusion region of this discriminator are analyzed through a sub-optimal filltering technique based on pulse integration. The composed discrimination system achieves 98\\% electron eficiency with less than 0.1\\% of pions being misclassified as electrons."
}