- Metrics
Vision
All Software Center companies have efficient product development, release and deployment processes.
Mission
We help the companies design and develop modern measurement methods and tools by utilizing state-of-the-art analytics, AI and machine learning.
We use Action Research to increase the impact and adoption of the results (Action Research in Software Engineering), i.e., we work on-site at the companies.
Over the course of ten years of collaboration, our theme has produced over 50 models and tools. We have also published over 200 papers and books that disseminate the results to the public domain.
Examples of metrics designed and introduced at the companies:
- Release readiness: measuring the number of weeks that an Agile product development team needs to release the product: "Release Readiness Indicator for Mature Agile and Lean Software Development Projects"
- Change waves: measuring the impact of a change on a software product: "Identifying Implicit Architectural Dependencies Using Measures of Source Code Change Waves"
- Defect inflow: predicting the number of defects that the development team will need to handle in the coming weeks: "Predicting weekly defect inflow in large software projects based on project planning and test status"
- Code quality: measuring and improving the impact of coding practices on software quality: "Recognizing lines of code violating company-specific coding guidelines using machine learning"
- Engineering level: measuring the quality of code in a Git repository: "PHANTOM: Curating GitHub for engineered software projects using time-series clustering"
- SimSAX project similarity: measuring the similarity of projects, for example to monitor process evolution: "LegacyPro—A DNA-Inspired Method for Identifying Process Legacies in Software Development Organizations" and "SimSAX: A measure of project similarity based on symbolic approximation method and software defect inflow"
- MeTeaM: measuring the maturity of software metrics teams: "MeTeaM—A method for characterizing mature software metrics teams"
- MeSRAM: measuring the quality and quantity of measurement programs: "MeSRAM – A method for assessing robustness of measurement programs in large software development organizations and its industrial evaluation"
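The SimSAX measure above builds on symbolic approximation of time series such as weekly defect inflow: the series is discretized into a short "word" and words from different projects are compared. The sketch below illustrates the general SAX idea only; the function names, the 4-letter alphabet, and the sample series are illustrative assumptions, not the published SimSAX implementation.

```python
import statistics

def sax(series, segments, alphabet="abcd"):
    """Discretize a numeric time series into a SAX word:
    z-normalize, reduce with Piecewise Aggregate Approximation (PAA),
    then map each segment mean to a symbol via Gaussian breakpoints."""
    mean = statistics.fmean(series)
    std = statistics.pstdev(series) or 1.0  # guard against constant series
    z = [(x - mean) / std for x in series]
    # PAA: average the normalized series over equally sized segments
    n = len(z)
    paa = []
    for i in range(segments):
        lo, hi = i * n // segments, (i + 1) * n // segments
        paa.append(sum(z[lo:hi]) / (hi - lo))
    # Breakpoints splitting N(0, 1) into 4 equiprobable regions
    breakpoints = [-0.67, 0.0, 0.67]
    return "".join(alphabet[sum(v > b for b in breakpoints)] for v in paa)

def similarity(word_a, word_b):
    """Fraction of positions where two equal-length SAX words agree."""
    return sum(a == b for a, b in zip(word_a, word_b)) / len(word_a)

# Two hypothetical weekly defect-inflow series from different projects
project_a = [3, 5, 8, 12, 9, 6, 4, 2]
project_b = [2, 4, 9, 11, 10, 5, 3, 1]
wa = sax(project_a, segments=4)
wb = sax(project_b, segments=4)
print(wa, wb, similarity(wa, wb))
```

With these sample series, both projects follow the same rise-and-fall shape, so they discretize to the same four-letter word and the similarity is 1.0; the published measure additionally slides windows over the series and works with defect-inflow data from real projects.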
Projects
- Continuous Product and Organizational Performance
- Stakeholder Communication
- Associated: MicroHRV
- Associated: T4AI
- Associated: Develop
- Finished: Quasar@Car - Quantifying meta-model changes
- Finished: VISEE - Verification and Validation of ISO 26262 requirements at the complete EE system level
- Finished: Longitudinal Measurement of Agility and Group Development
- Finished: Size and Quality between Software Development Approaches
- Finished: RAWFP - Resource Aware Functional Programming
Metrics blog
- Mitigating the impact of mislabeled data on deep predictive models: an empirical study of learning with noise approaches in software engineering tasks (April 8, 2024, Automated Software Engineering, Springer). Labelling data, annotating images or text, is really tedious work. I don't do it a lot, but when I do, it takes time. This paper presents […] Miroslaw Staron
- Sketches to models… (January 28, 2024). https://www.computer.org/csdl/proceedings-article/models/2023/248000a173/1SOLExN0XaU It has been a while since I worked with models, so I looked at how things have evolved. As I remember, one of the major problems with modelling was one of its broken promises: simplicity. The whole idea of modelling was to be able to sketch things, discuss candidate solutions and then […] Miroslaw Staron
- Modelling digital twins… (January 21, 2024). https://www.computer.org/csdl/proceedings-article/models/2023/248000a013/1SOLEPphpHa Digital twins are becoming increasingly important. They make it possible to monitor the real twin without costly measurements or sending technicians to the site where the real twin is located. However, developing them is not easy and is almost a one-off for every twin pair. The paper "A Model-driven Approach […]" Miroslaw Staron
- Generating documentation from notebooks (December 15, 2023). https://github.com/jyothivedurada/jyothivedurada.github.io/blob/main/papers/Cell2Doc.pdf Understanding code is the same regardless of whether it is in a Jupyter notebook or in another editor. Comments and documentation are the key. I try to teach that to my students and some of them, at least, appreciate it. Here is a paper that can change this for the better without […] Miroslaw Staron
- Log files and anomalies, once again… (December 8, 2023). https://arxiv.org/pdf/2308.09324.pdf I wrote about log files a while back, but I think I'm getting hooked on the topic. It is actually quite interesting how to use them in practice. So, here is one more paper from the ASE 2023 conference. This paper presents a new way to create log data that can help spot […] Miroslaw Staron
Theme 3 Leader: Miroslaw Staron
Professor, Software Engineering division, Department of Computer Science and Engineering, University of Gothenburg
More information
Miroslaw.Staron@cse.gu.se
Phone: +46 31 772 10 81