At Poli we do not provide financial incentives for users to share their opinions or data; we ask them to do so because they think it is interesting, fun and worthwhile. However, we understand that asking for data carries risks and responsibilities. Part of my role as a Non-Executive Director is to ensure that when key decisions need to be made, we put our ethics first. Here I set out in detail why it is important to Poli that we are responsible and ethical in the way we treat data and people, and what that means in practical terms.

We have no business without user trust. In the noughties, when Facebook and Google grew into the global giants they now are, people gave their data freely and often unthinkingly. Since then Edward Snowden has shown us how governments co-opt our data to their ends without our permission or knowledge, and Cambridge Analytica have claimed to have used Facebook data to put Trump in the White House. Users are wiser and warier than they were, and if they do not trust us they will vote with their feet. Without trust we will have no data, and we will have no business.

As board members, as a company, and as people, it is important to us that we hold ourselves to a high ethical standard. We are currently setting out the principles that will allow us to shape a business that puts ethics front and centre, and we will share these on our blog. The preciousness of user data and the sacrosanct nature of user trust will be at their core.

We must comply with existing and upcoming data legislation. We are subject to UK and European legislation, including GDPR, which presents interesting technical challenges and shapes product decisions around collecting data anonymously and honouring the right to be forgotten.

We anticipate - and would welcome - greater regulation and oversight in technology. There are parallels between the financial services industry in the mid to late 2000s and technology now: a period of exponential growth with limited oversight and limited understanding of what banks were doing. We all know where that led. A recent Harvard Harris poll showed that Americans are “evenly split” on the need for regulation of technology companies. Financial services companies in the UK are held to account against various sets of regulations - including Treating Customers Fairly. It is not hard to imagine a similar framework in technology, and we are working on the basis that at some point we may be asked to account for our actions.

So how have we responded to these needs?

We are building our product to be compliant with GDPR. GDPR was on the horizon when we started work 18 months ago, and it will be baked into everything we do. For instance, even though we collect data anonymously, the demographic data and opinions users share with us could be classified as sensitive under GDPR. For that reason we are building our product in a way that makes it easy and intuitive to give - and withdraw - consent to use that data.
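To make that idea concrete, here is a minimal sketch of how consent might be recorded against an anonymised ID. The names and structure below (ConsentRecord, grant, withdraw) are illustrative assumptions rather than Poli's actual implementation; the point is simply that withdrawing consent should be a single, auditable step, just as easy as granting it.

    from dataclasses import dataclass
    from datetime import datetime, timezone
    from typing import Optional

    @dataclass
    class ConsentRecord:
        """Consent state for a single anonymised ID - never a real-world identity."""
        anon_id: str                          # anonymised Messenger-scoped ID
        purpose: str                          # e.g. "demographic-analysis"
        granted_at: Optional[datetime] = None
        withdrawn_at: Optional[datetime] = None

        @property
        def active(self) -> bool:
            return self.granted_at is not None and self.withdrawn_at is None

        def grant(self) -> None:
            self.granted_at = datetime.now(timezone.utc)
            self.withdrawn_at = None

        def withdraw(self) -> None:
            # Withdrawal is one call, timestamped so it can be audited later.
            self.withdrawn_at = datetime.now(timezone.utc)

    # Usage: opinions linked to this ID are only processed while consent is active.
    record = ConsentRecord(anon_id="anon-3f9c12", purpose="demographic-analysis")
    record.grant()
    assert record.active
    record.withdraw()
    assert not record.active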

We collect user opinions and personal data anonymously. We receive our data via Facebook Messenger with a unique, anonymised ID, but we do not match that data to individuals. If a user provides all their demographic data we will know a lot about them, but we will not know who they are: not their name, their email address, nor their postal address. However, collecting data through Facebook does have some privacy consequences:

  • There is a Facebook API that could give us basic user profile information. We will not use this API or any other Facebook tool that allows us to identify users.
  • All user data flows through Facebook, and this data is subject to Facebook’s own data and privacy policies.
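As an illustration of the principle above, here is a simplified sketch of how incoming Messenger events could be reduced to nothing more than an anonymised sender ID and the message text. The field names follow our understanding of Facebook's Messenger webhook format, and the function itself is a hypothetical example rather than our production code.

    from typing import Any, Dict, List

    def extract_anonymous_events(webhook_body: Dict[str, Any]) -> List[Dict[str, str]]:
        """Reduce a Messenger webhook payload to (anon_id, text) pairs."""
        events: List[Dict[str, str]] = []
        for entry in webhook_body.get("entry", []):
            for event in entry.get("messaging", []):
                sender_id = event.get("sender", {}).get("id")  # page-scoped, anonymised ID
                text = event.get("message", {}).get("text")
                if sender_id and text:
                    # Deliberately no call to the profile API: the anonymised ID
                    # is the only identifier that ever reaches our systems.
                    events.append({"anon_id": sender_id, "text": text})
        return events

Filtering at the point of ingestion, rather than downstream, is one way to make sure identifying fields never reach storage at all.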

We have ethical oversight built into our company governance. At present I sit on the Poli board as a Non-Executive Director with specific responsibility for ethical oversight and people. As a member of the British Psychological Society and a Registered Chartered Psychologist, I am bound by a code of conduct and have a duty of care to users who share their data. I also have expertise in evaluating leaders in the financial services industry for their suitability - experience and capability - to operate in regulated roles. As mentioned, we are developing, and will shortly share, the principles of conduct against which we expect to be held to account. This model of oversight is right for us now, but as we grow we will adjust it, and share it openly.

We know that in order to build a product that is valuable to our customers and users we are asking for something precious. And as Nir Eyal remarks in his book Hooked, “it isn’t a super power unless it can be used for evil”. Our aim is bigger than just “not being evil”: it is to create a product and culture that is a model for the next generation of modern, responsible and ethical tech companies.