Peak1014 is a group of data-minded professionals who advocate for more statistical rigor and the prevention of unfair biases in data science projects.
We started cooperating when we realized there is room to improve how companies implement data science: machine learning applications and artificial intelligence approaches are only as good as the underlying statistical model.
We want models to be not only statistically sound but also fair: they should not take sensitive characteristics of their subjects into account unless doing so is genuinely required. People should also be able to understand what data about them is used and how it affects decisions about them.
Data science is not location-bound, so neither is our team. We currently have members in Amsterdam, The Netherlands, and Budapest, Hungary, and we plan to branch out to other countries as well.
We facilitate workshops that bring your organisation up to date on the relevance and current state of algorithmic fairness, transparency and privacy. See our workshop topics here.
We can help your organisation set guidelines and policies to ensure fairness in the way you work.
We can help you understand whether your algorithms work as intended and do what you think they do. If they do, we can check whether they treat their subjects fairly and equally – and if something is broken, we can help fix it.
Are parking enforcement officers ticketing cheap cars with the same likelihood as luxury cars? There could be (unintentional) biases in the way your employees behave – and we can help you spot them in your data.
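To give a flavour of what such a check can look like, here is a minimal sketch using entirely hypothetical counts: a chi-square test of independence comparing ticketing rates between two groups of cars. The numbers and grouping are illustrative only, not real data or a full audit methodology.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts: rows = car class, columns = (ticketed, not ticketed)
observed = np.array([
    [180, 1820],   # "cheap" cars:  9.0% ticketed
    [120, 1880],   # "luxury" cars: 6.0% ticketed
])

chi2, p_value, dof, expected = chi2_contingency(observed)
rates = observed[:, 0] / observed.sum(axis=1)

print(f"Ticketing rate (cheap):  {rates[0]:.1%}")
print(f"Ticketing rate (luxury): {rates[1]:.1%}")
print(f"Chi-square p-value: {p_value:.4f}")
# A small p-value suggests the gap in ticketing rates is unlikely to be chance
# alone -- a starting point for a closer look, not a verdict on fairness.
```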
Some organisations struggle with unintentional biases in their hiring or promotion processes. Some have tried to use machine learning to overcome this, but did not achieve the fairness they were aiming for. We can help you with this, too.
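As an illustration of one of the simplest fairness checks for such a model, the sketch below computes selection rates per group and their difference (often called the demographic parity difference). The column names and decisions are hypothetical; in practice this is only one metric among several we would look at.

```python
import pandas as pd

# Hypothetical model outputs: 1 = invited to interview, 0 = rejected.
decisions = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "selected": [1,   1,   0,   1,   0,   1,   0,   0],
})

# Positive-decision rate per group and the gap between groups.
selection_rates = decisions.groupby("group")["selected"].mean()
parity_gap = selection_rates.max() - selection_rates.min()

print(selection_rates)
print(f"Demographic parity difference: {parity_gap:.2f}")
# A large gap does not prove the model is unfair, but it is a signal worth
# investigating alongside other criteria such as equalized odds.
```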