New Post: Parler Tricks: Making Software Disappear

and goes on even under GDPR.

Yes. GDPR is only one step - not very drastic. It is mainly about informing people that their data is collected.

not at all, this is just a side effect to end users.

I was thinking of the user effect. I know it had effects on the provider side - at least they had to think about data protection in a new way. The real influence of the regulation is still to be seen. The small and honest got more work, but what about the big and dishonest? The internet is complicated, and it is difficult to regulate.

As with any law. It’s all about risk management :slight_smile: If a business assesses that the risk (and cost impact) of a fine is lower than the profit, they take it. But in general (in my practice) I see that it works. It’s not yet a reflex action as with other security risks and controls (e.g. it’s quite common to hear ‘Oh shit’ in answer to the question ‘what about GDPR?’ in design workshops), but it’s for sure one of the mandatory controls in the majority of P3M3 documents I’m working with.
As for how GDPR works in practice - you can take a look at this report.
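That risk calculus can be sketched as a toy expected-value check. All figures below are hypothetical illustrations (the 20M EUR number only echoes GDPR’s fine ceiling, which is the higher of 20M EUR or 4% of global annual turnover), not any real assessment:

```python
# Toy expected-value check for "take the risk" compliance decisions.
# All figures are hypothetical illustrations, not real assessments.

def worth_violating(extra_profit: float, fine: float, p_caught: float) -> bool:
    """True if the expected fine is smaller than the extra profit from violating."""
    expected_fine = fine * p_caught
    return expected_fine < extra_profit

# A GDPR-scale fine dwarfs a modest data-selling profit
# even at a low assumed detection probability:
decision = worth_violating(extra_profit=1_000_000, fine=20_000_000, p_caught=0.25)
# expected fine = 5,000,000 > 1,000,000 profit, so compliance wins here
```

The point is just that the decision flips only when fines or detection rates are low relative to the profit, which is why fine levels matter.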

Thanks for the link. For once, the facts were clearly presented.
But I am more concerned about apps gathering data. For example, an app that handles the phone’s torch can request access to all the data you have - why? Obviously to collect and sell it. This should be prohibited. Of course one can put the blame on the user (“Do not use such apps”), but it is almost impossible for a normal person to know what permissions the function actually needs. Open software is safer, but you have to rely on the community - on someone finding the problematic features.
I am most certainly NOT suggesting any mandatory verification, but some voluntary verification system could be useful - at least for those anxious users who feel they have no competence to assess the risks.

1 Like

One cause for concern is the amount of manpower the European Commission is calling for. According to recent documents, the task group that will lead DMA enforcement is to be composed of 80 individuals, which seems woefully inadequate given their task’s scope and complexity. Further, the Commission suggests creating this team only after the DMA is enacted, meaning they could be stuck playing catch up.

more regulation it seems … i’d rather those funds went to SPCs like Purism and independent activists rather than risk corruption (again)

We’re talking about different problems.

Sounds like you’re agreeing that there should be nothing to compel Google to host and distribute my application (meaning I cannot “force” them to list it in their store),
and saying that there should be further restrictions.

I.e. governments should be able to demand that Google/Apple remove applications from their stores.
So rather than my having complete control over what I have on my device, that position supports limiting the distribution chain of applications to me.

Also, I’m not specifically against any “we can do whatever we want” clauses - but I am for those being highlighted; that’s informed consent.
And with informed consent I can decide what I believe the value of my personal data is (which to me seems better than a government saying you cannot have access to this, overriding a trade that you consider fair) - but it does depend on whether you understand the value of the data.

Of course one cannot “force” Google or Apple to put an app into their shops, but I would like to give users more information about what the apps really gather - not just “we have checked the apps”.

The big problem with “informed consent” is that 90% of users are not informed at all. They don’t have a clue about data security or what their data is used for. Only when there is a really big scandal do they start thinking about these things - and usually that is too late. Sorry, but I am not very optimistic about users handling (or even caring about) these problems.

1 Like

For sure (people, including me, are broadly dumb and perhaps shouldn’t be allowed to make these choices for themselves), but you can see why I have little time for the arguments given in the blog.

Android = bad because there is only one app store (just not true)
Google = bad for penalizing other app stores (again, not true)
Google = bad because they stop you installing random 3rd-party apps that do bad things (unless you go outside the store)
Librem 5 = good because it ships with one (1) app store.
Purism = good because it doesn’t penalize other app stores (you just have to find/trust/add them!)
Purism = good because they have carefully curated apps to avoid apps that do bad things (unless you go outside their store).

Now, do I trust Purism to act in better faith than Google? Sure - right now.
But that’s not an argument for why the device is better; it’s an argument for why I think big tech is less trustworthy than smaller providers with purer missions. (And my answer may change when/if Purism becomes a very large player in the mobile game - i.e. when they become big tech.)

It really sounds like you’re arguing against the blog post’s freedom statements, saying it is good that Google controls what 90% of people can see because they aren’t sufficiently informed to know what to install anyway, AND that there are some applications that Google would allow (with appropriate warnings) that you think should be banned by law.

And that’s the crazy thing, because I do understand the argument: better data security can be achieved with fewer freedoms. (I.e. remove the freedom for companies to trick you out of your data, remove the freedom for the user to decide the value of their data vs. the service they receive for giving it up, and there is a lot more security.)

As far as arguments go, it’s bulletproof - it’s just not one I ethically subscribe to.

I am afraid you misunderstand. Apps gathering data without consent should be illegal in the same way as it is illegal to sell products by claiming they do things they don’t.

2 Likes

I don’t think I have misunderstood.
I’ve written that there are apps that gather personal data with appropriate warnings (i.e. good),
and spoke of companies that try to trick you into giving them data, i.e. not a fair exchange (i.e. bad).

So I understand that there are applications that ask for consent and applications that don’t - and actually Android is pretty good at asking/telling you “this app requires access to your phonebook, text messages, email and GPS position…” - that’s a pretty poor return to the user for a free torch application.
On the other hand, you might decide that a company knowing where you are at all times (GPS data) is a great trade-off for them sending you coupon codes to use in shops near where you are. (E.g. you’re on the high street at lunchtime and get a money-off code for Starbucks.)

It’d be interesting to see if PureOS gives these kinds of warnings also.

Selling laws aren’t applicable if you’re not selling something - i.e. listing a free torch application.

So what you’re really saying is that the government should make a new law governing what software may (legally) be distributed by a private enterprise.
The government should be able to tell Google that they are not allowed to list applications in their Play Store (i.e. the government decides what software I’m allowed to install on my phone). The government can make applications illegal and prosecute anybody who distributes them (that is the mechanism for forcing Google to remove apps, and it is the mechanism that would be used for forcing Purism to remove apps).

That’s not freedom. Security yes, freedom no.

Purism’s Social Purpose Corporation charter should prevent Purism from ever becoming a company that acts like Google or the other tech giants. Look at the text of the charter:

2.2 In addition, the Corporation is organized for the following purposes (collectively referred to as “Specific Social Purpose”):

  • The Corporation will prioritize privacy, security, and freedom for its customers. The Corporation will place respecting users’ rights to privacy, security, and freedom at the forefront of its mission.
  • The Corporation will only use and distribute free/libre and open source software in the kernel, OS, and software in its products. Free/Libre and Open Source Software is software that respects users’ freedom. Non-free, or proprietary, software and installable firmware within the kernel will be strictly prohibited within the Corporation. The Corporation’s operating system and kernel and all software will be “free” according to the strictest of guidelines set forth by the Free Software Foundation’s Free Software Definition.
  • The Corporation will design and manufacture hardware that respects users’ rights to privacy, security, and freedom. The Corporation will use hardware and software that respects users’ rights. Non-free, or proprietary, chipsets that require installable firmware binaries into the kernel will be strictly prohibited within the Corporation. If a suitable component part that fully respects these rights is not available in the marketplace, the Corporation may use a part in its products that does not meet this standard if it is necessary for the product to be fit for purpose, in which case the Corporation will: (1) provide purchasers of the product, in writing, with strong evidence that a free version of the part with equivalent specifications is not available and that developing a free version of such would not be feasible at that point in time; and (2) actively pursue the development of a free version of the part for its future products.
  • The Corporation will not discriminate against individuals, groups or fields of endeavor. The Corporation will allow any person, or any group of persons, in any field of endeavor to use its systems for whatever purpose.
  • The Corporation will source, and manufacture the highest quality hardware. The Corporation will endeavor to source the best component parts that operate using free/libre and open source software. When considering the selection of parts, The Corporation will weigh such issues as privacy, security, freedom, ethical working conditions, environmental impact, and performance, among other factors.
  • The Corporation will release all software written by The Corporation under a free software license.
  • The Corporation will release all hardware schematics authored by The Corporation under a free hardware license.
  • The Corporation will release encryption tools and services and will design these tools such that The Corporation will have no means to access users’ encrypted data.

2.3 The mission of this social purpose corporation is not necessarily compatible with, and may be contrary to, maximizing profits and earnings for shareholders, or maximizing shareholder value in any sale, merger, acquisition, or other similar actions of the Corporation.

What this means is that if Purism violates any of these clauses, any shareholder can sue the company. It means that it will be very hard for anyone to take over the company and start operating it like Google, because the new management of the company could be sued. This is basically a warning to any investor that Purism won’t try to maximize its profits if it means violating one of these clauses, so shareholders who want that should stay away from the company.

These are pretty good guarantees that Purism will always act ethically in terms of protecting the privacy of their users, and will never turn into a tech company that monetizes users’ personal information.

2 Likes

It is forbidden to open other people’s letters and read them, but on the digital side it is usually allowed to steal information from anyone. I want a kind of data secrecy in the digital domain too - the privacy of paper letters retained even when the medium is digital.
If you give your consent, you can of course download and install any snooper application. And I am not talking about listing: Google can list any bad app they want - as long as it is clear what it does. Independent verification would simply assure the user that an app is not doing anything that isn’t clearly stated in its description.
Freedom for whom? Freedom to steal data from the user? Or freedom for the user to trust that an app is not stealing his data (unless it is clearly stated)?

1 Like

I remember when Google used to say “don’t be evil”.
I remember when Jack Dorsey testified to Congress under oath that they would not block a world leader from participating on their platform (that sunlight was the best disinfectant), and that this was why they hadn’t removed the Ayatollah despite a constant stream of antisemitic posts coming from that account.

Charters can be changed - and one day it might make sense for Purism to change their charter in response to external or internal pressures (just as Google did, just as Twitter did, etc.).

You can’t guard against that. As I said, I trust them now… (but that could change.)

yeah… I mean except history shows that may not always be the case.

The initial Librem One (privacy suite) shipped with trackers that the company didn’t even know about, since they literally just took someone else’s code and put their own name on it…

2 Likes

Freedom for me to install whatever app I like from whatever supplier I choose to trust.

Now, of course that doesn’t mean that applications should act in nefarious ways (and this is how the law currently works):
the law doesn’t stop an app taking data it doesn’t need, and the law doesn’t stop the company selling said data. The law stops the company that makes the app and sells the data from misrepresenting (lying about) what they are doing, and penalizes them (with fines) if they do.

Under GDPR the fines are set high enough for most companies to want to comply. So everything is managed under current laws - there is no need to call an app illegal, because it isn’t the app that is illegal; it’s the actions of a company illegally collecting and selling data that are illegal.

There is a very big difference between saying “don’t be evil” as an internal corporate marketing strategy, and going to the trouble we have to encapsulate our values into a binding corporate charter. We did that not because the community demanded it (the community didn’t know about SPCs until we became one), but because we truly believe in these values and specifically wanted to prevent the possibility of an outside investor forcing us to violate our ethics or suing us for not “maximizing shareholder value.”

No, we forked a well-known client and made changes not simply to put our own name on it, but because we needed to make it more convenient than the traditional client. The goal, as with the Librem Mail client and Librem Social, was to let a user just enter their Librem One username and password and have it “just work”, without also having to enter server information and deal with other complexities. You have to realize we are trying to compete with Google and other all-in-one service suites that are incredibly convenient for the user: one login works across multiple applications without the user having to enter server and other information.

Writing yet another Matrix client would have been a waste of our resources and reinventing the wheel, when all we wanted was to provide people an easy way to log into Librem Chat with their username and password.

We removed the trackers we knew about but there was still latent (possibly unused) code that was flagged with a different scanning tool, and once we became aware of it, we removed it. So much of the phone app ecosystem is dedicated to spying that it’s really difficult, even if you use free software applications on those platforms sometimes, to identify and remove all the trackers on existing code.

5 Likes

I know how hard it would be to write the complete suite from scratch, my criticism isn’t that existing code was forked, rebranded and re-released.

Whilst I couldn’t have done better myself, this is an example of how, even with charters, even with a team of developers, even with a promise that Purism vets all the source code for the apps it releases…
“features” that don’t respect privacy were still released! - like you promised would never happen.

Treating that as a broken promise is unreasonable, just as if we released software that was later found to have a security bug which we then patched upon discovery, we wouldn’t be breaking a promise to prioritize the security of our customers.

Our corporate charter states our values and what our priorities are. It does not promise we won’t make mistakes as we work to achieve our goals according to our values. I certainly hope you don’t actually think it does, and are instead doing the typical “devil’s advocate” forum discussion thing.

3 Likes

I’m not treating it like a broken promise. (That was a poor choice of words.)
There is a huge difference between an accidental oversight a couple of years ago and a malicious action (or inaction.)
The article literally says “we’re putting in new systems to aid in detecting things like this in future”

Two posts ago I wrote “yes I trust Purism to be able to deliver this promise”

My only point about the charter was that things change - and that can happen for a variety of reasons.
People have spent most of this year literally threatening the lives of people/spouses/co-workers/pets of people they politically disagree with… (whether that is the Parler CEO, or that woman in charge of certifying votes in Georgia)
All I’m saying is there are more pressures than business pressures once a company reaches a certain size, or certain audience…
If you had a genuine and credible threat against your life or your employees’ lives demanding you “pull” an app from your store, I would hope that the response would be: nuts to the charter, pull the app, call the FBI, and hope to reinstate the app in future. (It’s rhetorical, hyperbolic and not even a question.) The point is that life can turn out to be much more complicated than anyone can ever imagine… the charter is a great guide, but it’s no stone tablet!