
12 Considerations for Opening Your Next Account – Part 1

Intro

Description

Have you ever opened an account and regretted it? Has your private data been used, abused, and breached? We have years of experience fighting with hundreds, if not thousands, of different apps and services to harden accounts, sanitize and delete sensitive information, and recover from data breaches.

In this episode, we share our insights from these experiences so you can hopefully avoid the pitfalls we and our clients have encountered over the years. More specifically, here in Part 1 of 2, we discuss:

  1. Avoiding data silo mentality
  2. Your account and data can be taken away at any time
  3. Policies can change against you at any time
  4. If a system can ID you, assume this is permanent
  5. Apps/services making it difficult or impossible to change or delete data
  6. Some systems will disable your account when you change your data

Take these considerations seriously when opening your next account, and stay tuned for Part 2.

Notes

  1. The images presented in the video and blog content for this episode are AI-generated and provided for entertainment purposes only
  2. MIT Technology Review (external) articles referenced in this episode:
    1. A Roomba Recorded a Woman on the Toilet. How Did Screenshots End up on Facebook?
    2. Roomba Testers Feel Misled After Intimate Images Ended up on Facebook

Podcast

1
00:00:00,000 –> 00:00:14,200
Hey everybody, welcome to the Bigger Insights Privacy & Security podcast.

2
00:00:14,200 –> 00:00:21,080
This episode is Part 1 of 12 Considerations for Opening Your Next Account.

3
00:00:21,080 –> 00:00:25,400
There’s going to be two parts to this because there’s a lot of information here and we don’t

4
00:00:25,400 –> 00:00:28,520
want to try to cram all of it into one episode.

5
00:00:28,520 –> 00:00:32,320
So make sure you subscribe and stay tuned for Part 2.

6
00:00:32,320 –> 00:00:38,240
When we say account, we really just mean, you know, using any app or service that requires

7
00:00:38,240 –> 00:00:43,920
registration or providing any kind of personally-identifiable information (PII).

8
00:00:43,920 –> 00:00:49,880
Our motivation for making this episode is that we have extensive experience opening,

9
00:00:49,880 –> 00:00:54,800
sanitizing, and closing accounts for ourselves and our clients.

10
00:00:54,800 –> 00:01:01,640
We’ve encountered many, many headaches, disturbing practices and data breaches along the way.

11
00:01:01,640 –> 00:01:06,840
So we hope that this episode will help you avoid these issues going forward.

12
00:01:06,840 –> 00:01:12,880
If you’ve ever made an account and regretted it or handed over some information that maybe

13
00:01:12,880 –> 00:01:18,960
you wanted to take back or you’ve been involved in a data breach, you probably know how this

14
00:01:18,960 –> 00:01:20,400
feels.

15
00:01:20,400 –> 00:01:25,120
And even if you think that that’s not the case for you for whatever reason, just be

16
00:01:25,120 –> 00:01:31,160
aware that your information and accounts have probably been breached many, many times without

17
00:01:31,160 –> 00:01:32,640
your knowledge.

18
00:01:32,640 –> 00:01:38,200
We may go into more detail about this in a future episode, but research is showing that

19
00:01:38,200 –> 00:01:43,600
many companies, especially smaller ones, are suffering from data breaches, but just not

20
00:01:43,600 –> 00:01:45,280
disclosing them.

21
00:01:45,280 –> 00:01:50,380
And along those lines, we think everyone should take this material seriously because often

22
00:01:50,380 –> 00:01:55,880
times when you open an account, that comes with irreversible side effects.

23
00:01:55,880 –> 00:02:00,920
You know, once you give a company like Google or Fecesbook or whomever, some of your private

24
00:02:00,920 –> 00:02:06,400
information like a photo or something, we have reasonable suspicion that you’ll never

25
00:02:06,400 –> 00:02:11,000
be able to take that back – at least some of the metadata that’s associated with the data

26
00:02:11,000 –> 00:02:12,000
that you’ve given them.
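
To make the metadata point concrete, here is a minimal Python sketch, using the Pillow imaging library, of how much EXIF data a typical phone photo carries and how you might strip it before uploading. The file names are hypothetical, and note that this only removes embedded tags; the pixels themselves (faces, landmarks, reflections) can still give plenty away.

    from PIL import Image
    from PIL.ExifTags import TAGS

    # Hypothetical input; any JPEG straight off a phone will do.
    img = Image.open("vacation.jpg")

    # Dump whatever the camera embedded: GPS coordinates, device
    # make/model, timestamps, software version, and so on.
    for tag_id, value in img.getexif().items():
        print(TAGS.get(tag_id, tag_id), value)

    # Re-save the pixels only, leaving the metadata behind.
    clean = Image.new(img.mode, img.size)
    clean.putdata(list(img.getdata()))
    clean.save("vacation_clean.jpg")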

27
00:02:12,000 –> 00:02:13,000
All right.

28
00:02:13,000 –> 00:02:17,720
So now we’re going to get into the first six of the 12 considerations, but just keep in

29
00:02:17,720 –> 00:02:20,840
mind that they’re not in any particular order.

30
00:02:20,840 –> 00:02:25,560
The first is: Avoiding data silo mentality.

31
00:02:25,560 –> 00:02:31,840
And what this is, is the false belief that you can control your data by being selective

32
00:02:31,840 –> 00:02:36,240
about what information you expose to each app or service.

33
00:02:36,240 –> 00:02:41,160
Believe it or not, a lot of developers and service providers out there share data with

34
00:02:41,160 –> 00:02:42,560
each other.

35
00:02:42,560 –> 00:02:48,720
This can apply to paid services, but is especially problematic for free ones.

36
00:02:48,720 –> 00:02:54,280
There are some good respected free apps out there that don’t collect and share your information,

37
00:02:54,280 –> 00:02:59,920
but most free apps and services are funded by monetizing your information.

38
00:02:59,920 –> 00:03:06,320
So next time you see some free dating app or some app where you scan your receipts and

39
00:03:06,320 –> 00:03:11,040
send them somewhere and they give you a coupon or something, just keep this in mind.

40
00:03:11,040 –> 00:03:15,520
They function by monetizing your private data.

41
00:03:15,520 –> 00:03:20,040
And obviously, once they do that, it gets passed and shared around and there’s no taking

42
00:03:20,040 –> 00:03:21,040
that back.

43
00:03:21,040 –> 00:03:26,680
And we’re of the opinion that most people know that to some degree, but we think that

44
00:03:26,680 –> 00:03:32,840
the critical piece that’s missing is a lack of understanding as to how bad this problem

45
00:03:32,840 –> 00:03:34,560
really is.

46
00:03:34,560 –> 00:03:40,920
There are many data brokers out there that know virtually everything about you: Contact

47
00:03:40,920 –> 00:03:46,680
information, family and friend associations and social graph, political and religious

48
00:03:46,680 –> 00:03:52,640
beliefs, health information, what medications you’re taking, the devices you own, how much

49
00:03:52,640 –> 00:03:59,840
money you make, what your assets are, historical and real-time location data, purchase history,

50
00:03:59,840 –> 00:04:02,480
browsing history, hobbies and interests.

51
00:04:02,480 –> 00:04:04,560
I can go on forever, baby!

52
00:04:04,560 –> 00:04:11,160
But seriously, if you wanted to find, for example, a pregnant Hispanic woman with two

53
00:04:11,160 –> 00:04:17,600
kids, who takes Zoloft and has type two diabetes and likes watching Dancing with the Stars,

54
00:04:17,600 –> 00:04:20,840
these companies could do that for you like it’s nothing.

55
00:04:20,840 –> 00:04:25,280
So if you’re like us and you don’t like the sound of that, you really need to start thinking

56
00:04:25,280 –> 00:04:31,120
twice about which apps and services you use. Along these lines,

57
00:04:31,120 –> 00:04:36,560
make sure you’re aware of who owns an app or service that you want to use.

58
00:04:36,560 –> 00:04:43,200
We’ve seen videos where people will sit there and bash Fecesbook and then say, “Oh yeah,

59
00:04:43,200 –> 00:04:45,880
I moved to Instagram.”

60
00:04:45,880 –> 00:04:48,560
Instagram is owned by Fecesbook.

61
00:04:48,560 –> 00:04:54,160
So if you have a problem with Fecesbook or Mark Zuckerberg or whomever, you’re not really

62
00:04:54,160 –> 00:04:57,080
doing yourself any favors by moving to Instagram.

63
00:04:57,080 –> 00:04:59,160
It’s the same company.

64
00:04:59,160 –> 00:05:00,660
Same thing with WhatsApp.

65
00:05:00,660 –> 00:05:03,160
Fecesbook owns that as well.

66
00:05:03,160 –> 00:05:07,240
You also need to be careful about where you place your trust.

67
00:05:07,240 –> 00:05:13,240
We prefer apps and services that collect little or no personally-identifiable information so

68
00:05:13,240 –> 00:05:16,280
that we don’t have to trust them.

69
00:05:16,280 –> 00:05:21,720
Keep in mind that any app or service that has your information is liable to be bought

70
00:05:21,720 –> 00:05:22,720
in the future.

71
00:05:22,720 –> 00:05:26,080
And with that purchase comes your data.

72
00:05:26,080 –> 00:05:31,680
So you might trust some smaller company with your health and fitness data today, but you

73
00:05:31,680 –> 00:05:37,280
need to consider that that company and your data may end up in the hands of a company

74
00:05:37,280 –> 00:05:39,960
like Fecesbook in the future.

75
00:05:39,960 –> 00:05:42,680
That’s just how these creepy companies roll.

76
00:05:42,680 –> 00:05:47,480
If they want your data and they don’t think that you’re going to willingly hand it over

77
00:05:47,480 –> 00:05:52,840
to them, they’ll just buy an app or service that does have that data.

78
00:05:52,840 –> 00:05:58,280
So obviously, if you don’t give that data up in the first place, it’s not for sale.

79
00:05:58,280 –> 00:06:02,760
Now, let me give you some examples to help drive this point home.

80
00:06:02,760 –> 00:06:10,600
It’s been recently discovered that PayPal shares user data with over 600 companies.

81
00:06:10,600 –> 00:06:13,760
That’s not a typo: 600.

82
00:06:13,760 –> 00:06:17,640
We’ve discovered this about PayPal and many other companies.

83
00:06:17,640 –> 00:06:21,720
A lot of them actually share your data with over 800 companies.

84
00:06:21,720 –> 00:06:26,080
But when you go to certain websites, they pop up those little cookie banners that basically

85
00:06:26,080 –> 00:06:28,960
say “We value your privacy.”

86
00:06:28,960 –> 00:06:33,360
And then they go on to say how they basically share your private information with hundreds

87
00:06:33,360 –> 00:06:40,680
of companies, with names like AdMob and A Million Ads and all these other creepy companies.

88
00:06:40,680 –> 00:06:45,440
If you have the time, we encourage you to kind of click through those and see what they’re

89
00:06:45,440 –> 00:06:47,120
talking about.

90
00:06:47,120 –> 00:06:54,280
Some of them will have a link or a tab where they list all of the “partners” in quotes that

91
00:06:54,280 –> 00:06:56,760
they’ll share your information with.

92
00:06:56,760 –> 00:07:00,560
It’ll shock you how many “partners” these companies deal with.

93
00:07:00,560 –> 00:07:04,840
And if you don’t believe me, go to our Twitter profile,

94
00:07:04,840 –> 00:07:06,720
it’s @BiggerInsights.

95
00:07:06,720 –> 00:07:14,480
And we posted like a two-minute video, I think a few weeks ago, where we showed

96
00:07:14,480 –> 00:07:21,680
all of the companies that the website howtogeek.com shares your data with.

97
00:07:21,680 –> 00:07:28,240
And one of the things that’s funny about that video is I know that Twitter has like a two

98
00:07:28,240 –> 00:07:32,720
minute and 20 second limit on videos or something like that.

99
00:07:32,720 –> 00:07:38,840
So I was watching a timer as I was recording this video, because there were so many companies

100
00:07:38,840 –> 00:07:44,160
in this list that I started to realize that I was running out of time so you can see me

101
00:07:44,160 –> 00:07:49,280
like panicking, scrolling faster and faster and faster, trying to show all of these companies

102
00:07:49,280 –> 00:07:52,000
before I ran out of my two minute limit.

103
00:07:52,000 –> 00:07:56,600
You know, that’s how bad this problem is, and that’s on one website.

104
00:07:56,600 –> 00:08:02,480
I mean, imagine if you go to 100 websites, you know, over the course of a week or something,

105
00:08:02,480 –> 00:08:09,440
and each of those is sharing your information with 600, 700 or 800 other companies.

106
00:08:09,440 –> 00:08:14,480
It’s insane, it’s just mind-boggling how much of this information sharing is going on.

107
00:08:14,480 –> 00:08:19,160
And very few people have any idea that this is what’s going on.

108
00:08:19,160 –> 00:08:25,520
We also discovered that Trello, which is a popular app made by Atlassian, shares a lot

109
00:08:25,520 –> 00:08:27,960
of data with Fecesbook.

110
00:08:27,960 –> 00:08:31,520
We might cover that in more detail in a future episode.

111
00:08:31,520 –> 00:08:39,080
It’s also come to light that a very popular app called Life360 was selling precise location

112
00:08:39,080 –> 00:08:46,200
data to a dozen different data brokers, some of which sold it onward to buyers including the Department of Defense (DoD).

113
00:08:46,200 –> 00:08:52,760
So this is an app, it’s sold as like a family safety type of deal or something where they

114
00:08:52,760 –> 00:08:57,480
collect your location information pretty much all the time.

115
00:08:57,480 –> 00:09:02,240
And you know, if you go missing or something happens like that, then your family members

116
00:09:02,240 –> 00:09:06,760
or whomever can see where you are, what’s going on or something like that.

117
00:09:06,760 –> 00:09:12,280
But nobody really stopped and asked themselves, well, “How is this free?

118
00:09:12,280 –> 00:09:13,280
What’s in it for them?

119
00:09:13,280 –> 00:09:16,640
And what are they doing with my location information?”

120
00:09:16,640 –> 00:09:20,600
And you know, not surprising to us, it turns out that they were selling it.

121
00:09:20,600 –> 00:09:26,040
And just as some other quick examples: Fecesbook bought WhatsApp. As we like to

122
00:09:26,040 –> 00:09:30,520
say, they didn’t buy WhatsApp, they bought the users of WhatsApp.

123
00:09:30,520 –> 00:09:34,680
That’s why they paid $19 billion for a free messaging app.

124
00:09:34,680 –> 00:09:37,320
They bought YOU and your data.

125
00:09:37,320 –> 00:09:43,400
Google bought Fitbit, which means that now they own all that Fitbit data that those devices

126
00:09:43,400 –> 00:09:46,440
have been collecting on their users for years.

127
00:09:46,440 –> 00:09:50,160
And that might not sound like a big deal to you, but there’s some very sensitive information

128
00:09:50,160 –> 00:09:51,160
in there.

129
00:09:51,160 –> 00:09:57,320
You know, this “fitness data” is really a proxy for health data, which, if

130
00:09:57,320 –> 00:10:02,040
it were in the hands of a doctor, would be protected by HIPAA.

131
00:10:02,040 –> 00:10:07,440
But if it’s through an app or a service like Fitbit, it’s not protected and companies like

132
00:10:07,440 –> 00:10:10,800
Google can do whatever they want with that information.

133
00:10:10,800 –> 00:10:14,800
Amazon bought iRobot, which is an interesting development.

134
00:10:14,800 –> 00:10:21,720
I mean, iRobot has already had some privacy controversies of their own when people found

135
00:10:21,720 –> 00:10:27,560
out that they were uploading floor plans of people’s homes back to their servers.

136
00:10:27,560 –> 00:10:32,600
Now Amazon owns that information, they’re going to continue to collect that information

137
00:10:32,600 –> 00:10:36,400
and other data about what’s going on in your home.

138
00:10:36,400 –> 00:10:42,720
And I haven’t seen them myself, but I’ve heard that some of the Roomba models, either currently

139
00:10:42,720 –> 00:10:47,040
or ones that are coming out in the future, are going to have cameras on them.

140
00:10:47,040 –> 00:10:50,640
I’m going to quote an article that we’ve read about this.

141
00:10:50,640 –> 00:10:56,840
I wish that I had written down where it was from, but somebody said, “After Amazon’s $1.7

142
00:10:56,840 –> 00:11:03,080
billion acquisition of Roomba Maker iRobot, the beloved automated vacuums now belong

143
00:11:03,080 –> 00:11:09,400
to Jeff Bezos, who probably wasn’t being motivated by a passion for clean floors.”

144
00:11:09,400 –> 00:11:14,320
Now if you’re picking up what we and they are putting down, what we’re talking about

145
00:11:14,320 –> 00:11:15,320
is your data.

146
00:11:15,320 –> 00:11:18,880
They don’t care about the vacuums, they don’t care about your floors.

147
00:11:18,880 –> 00:11:20,760
They want your data.

148
00:11:20,760 –> 00:11:24,160
They want to know what’s going on in your home.

149
00:11:24,160 –> 00:11:30,400
Now just imagine for a second that now you’ve got this internet-connected vacuum that can

150
00:11:30,400 –> 00:11:36,600
map out your home and has cameras on it so they can see who and what is in your home.

151
00:11:36,600 –> 00:11:42,720
And let’s say you also have a Ring camera and an Alexa and whatever, you know, “smart”

152
00:11:42,720 –> 00:11:49,520
devices that they’ve got, just think about how much information this one company is collecting

153
00:11:49,520 –> 00:11:53,640
about you, the people in your home and what’s going on in your home.

154
00:11:53,640 –> 00:11:58,560
It’s extremely disturbing. And we’re not going to state that this is what’s going on.

155
00:11:58,560 –> 00:12:00,280
I just thought this was funny.

156
00:12:00,280 –> 00:12:07,400
I was literally watching Rick and Morty just the other day and I think it was the season

157
00:12:07,400 –> 00:12:13,560
six finale where the president of the United States and a bunch of his goons were in Rick’s

158
00:12:13,560 –> 00:12:19,240
home and, you know, Rick was mad about this and he said, “If I wanted the government in

159
00:12:19,240 –> 00:12:23,520
my home, I’d buy an Alexa”, which, you know, we thought was pretty funny.

160
00:12:23,520 –> 00:12:27,720
All right, so I’m coming back from the future on this one.

161
00:12:27,720 –> 00:12:32,640
I’m in the middle of the editing process and, you know, some of these things that we talk

162
00:12:32,640 –> 00:12:37,680
about are just coming from memory. And every once in a while, I’ll talk about something

163
00:12:37,680 –> 00:12:43,320
that I’ve wondered to myself, like, “Am I actually remembering that correctly?”

164
00:12:43,320 –> 00:12:48,440
And this whole deal with the Roombas with the cameras is one of those examples.

165
00:12:48,440 –> 00:12:54,760
So I stopped editing and looked it up and what I found was quite amusing.

166
00:12:54,760 –> 00:12:55,960
So get a kick out of this.

167
00:12:55,960 –> 00:13:03,920
I’m looking at an article on MIT Technology Review titled, “A Roomba recorded a woman on

168
00:13:03,920 –> 00:13:05,200
the toilet.

169
00:13:05,200 –> 00:13:08,520
How did screenshots end up on Facebook?”

170
00:13:08,520 –> 00:13:10,200
You can’t make this stuff up.

171
00:13:10,200 –> 00:13:14,720
All right, let me read a little bit from this article because it’s just too good to pass

172
00:13:14,720 –> 00:13:15,720
up.

173
00:13:15,720 –> 00:13:23,000
It says, I quote, “In the fall of 2020, gig workers in Venezuela posted a series of images

174
00:13:23,000 –> 00:13:27,000
to online forums where they gathered to talk shop.

175
00:13:27,000 –> 00:13:34,080
The photos were mundane, if sometimes intimate, household scenes captured from low angles,

176
00:13:34,080 –> 00:13:37,400
including some you really wouldn’t want to share on the internet.

177
00:13:37,400 –> 00:13:43,520
In one particularly revealing shot, a young woman in a lavender t-shirt sits on the toilet,

178
00:13:43,520 –> 00:13:45,960
her shorts pulled down to mid-thigh.

179
00:13:45,960 –> 00:13:52,480
The images were not taken by a person, but by development versions of iRobot’s Roomba J7

180
00:13:52,480 –> 00:13:54,600
series robot vacuum.

181
00:13:54,600 –> 00:14:00,920
They were sent to Scale AI, a startup that contracts workers around the world to label

182
00:14:00,920 –> 00:14:06,480
audio, photo, and video data used to train artificial intelligence (AI).”

183
00:14:06,480 –> 00:14:11,880
And I’m going to skip a few lines here, but then it says, “Furniture, decor, and objects

184
00:14:11,880 –> 00:14:18,440
located high on the walls and ceilings are outlined by rectangular boxes and accompanied

185
00:14:18,440 –> 00:14:24,120
by labels like “TV”, “plant_or_flower” and “ceiling light”.”

186
00:14:24,120 –> 00:14:29,880
Now in Amazon’s defense, they basically said that these were like development prototypes

187
00:14:29,880 –> 00:14:35,000
and that the people who had these had kind of signed up for some special program where

188
00:14:35,000 –> 00:14:39,200
they agreed to let it record this information and blah, blah, blah.

189
00:14:39,200 –> 00:14:42,960
But this is still pretty disturbing for a number of reasons.

190
00:14:42,960 –> 00:14:49,600
One: Does anybody really understand what they’re agreeing to either in this program or when

191
00:14:49,600 –> 00:14:52,880
you buy an actual Roomba off the shelf?

192
00:14:52,880 –> 00:14:58,400
Do their agreements not allow them to do what we’re talking about in this article here?

193
00:14:58,400 –> 00:15:03,680
You know, studies have shown that even attorneys have a really difficult time understanding

194
00:15:03,680 –> 00:15:09,080
what you’re agreeing to in terms and conditions from, you know, big companies like Amazon

195
00:15:09,080 –> 00:15:10,960
and Google and whomever.

196
00:15:10,960 –> 00:15:16,320
So how is anyone supposed to have any confidence that this isn’t going to be going on with

197
00:15:16,320 –> 00:15:20,760
the Roombas that they buy off the shelf or off of Amazon.com?

198
00:15:20,760 –> 00:15:25,920
And another thing that’s disturbing about this is the statement where they talked about

199
00:15:25,920 –> 00:15:32,120
the TV, the plants and the ceiling light and objects located high on the walls and ceilings

200
00:15:32,120 –> 00:15:33,720
being labeled.

201
00:15:33,720 –> 00:15:39,200
Now Amazon can explain this away all they want about how this was a development program.

202
00:15:39,200 –> 00:15:43,720
But what does the ceiling light have to do with cleaning my floors?

203
00:15:43,720 –> 00:15:48,240
And our response to that is, it’s not about cleaning your floors.

204
00:15:48,240 –> 00:15:49,760
That’s not what this is about.

205
00:15:49,760 –> 00:15:51,760
This is about your data.

206
00:15:51,760 –> 00:15:57,080
All right, back from the future again, because this is the gift that keeps on giving. They

207
00:15:57,080 –> 00:16:04,960
posted a follow-up article called “Roomba Testers Feel Misled After Intimate Images Ended Up on

208
00:16:04,960 –> 00:16:05,960
Facebook.”

209
00:16:05,960 –> 00:16:13,200
I’m going to read a few lines here, “When Greg unboxed a new Roomba robot vacuum

210
00:16:13,200 –> 00:16:18,960
cleaner in December 2019, he thought he knew what he was getting into.

211
00:16:18,960 –> 00:16:25,440
He would allow the preproduction test version of iRobot’s Roomba J series device to roam

212
00:16:25,440 –> 00:16:31,400
around his house, let it collect all sorts of data to help improve its artificial intelligence (AI)

213
00:16:31,400 –> 00:16:35,920
and provide feedback to iRobot about his user experience.

214
00:16:35,920 –> 00:16:38,280
He had done this all before.

215
00:16:38,280 –> 00:16:43,800
Outside of his day job as an engineer at a software company, Greg has been beta testing

216
00:16:43,800 –> 00:16:46,040
products for the past decade.

217
00:16:46,040 –> 00:16:52,560
But what Greg didn’t know and does not believe he consented to was that iRobot would share

218
00:16:52,560 –> 00:17:00,440
test users’ data in a sprawling global data supply chain where everything and every person

219
00:17:00,440 –> 00:17:06,600
captured by the device’s front-facing cameras could be seen and perhaps annotated by

220
00:17:06,600 –> 00:17:13,080
low-paid contractors outside the United States who could screenshot and share images at their

221
00:17:13,080 –> 00:17:14,080
will.

222
00:17:14,080 –> 00:17:19,560
Greg, who asked that we identify him only by his first name because he signed a nondisclosure

223
00:17:19,560 –> 00:17:26,440
agreement with iRobot, is not the only test user who feels dismayed and betrayed.”

224
00:17:26,440 –> 00:17:31,400
And just to add a little bit more color to the whole Google buying Fitbit thing, this

225
00:17:31,400 –> 00:17:32,400
is very disturbing.

226
00:17:32,400 –> 00:17:36,440
This is something that you really need to be thinking about if you have a Fitbit.

227
00:17:36,440 –> 00:17:43,160
But in addition to Google being able to infer things about your health data, they can also

228
00:17:43,160 –> 00:17:49,920
see other highly-sensitive information like your location and other activities like your

229
00:17:49,920 –> 00:17:56,960
sleep, when you sleep, the quality of your sleep, who you’re sleeping with, your sexual

230
00:17:56,960 –> 00:17:59,280
activities, including affairs.

231
00:17:59,280 –> 00:18:04,280
You know, if you’re married, for example, Google can use your Fitbit to see that

232
00:18:04,280 –> 00:18:10,040
you went to some other address at three in the morning and saw some heightened physical

233
00:18:10,040 –> 00:18:16,520
activity for a certain period of time, you know, you can see how that looks.

234
00:18:16,520 –> 00:18:21,160
And if you think I’m just being facetious about that, this data really does exist.

235
00:18:21,160 –> 00:18:26,960
And this is going on, and this kind of data and the inferences from that data have been

236
00:18:26,960 –> 00:18:31,120
used in real court cases, so you can go look that up.

237
00:18:31,120 –> 00:18:32,320
It’s quite interesting.

238
00:18:32,320 –> 00:18:40,000
All right, Consideration #2: Your account can be locked, deleted, or banned at

239
00:18:40,000 –> 00:18:46,160
any time and for any reason. This can be done by an automated system.

240
00:18:46,160 –> 00:18:50,560
You know, maybe it doesn’t like what’s in your photos or something that you’re saying. This

241
00:18:50,560 –> 00:18:56,560
can be done by a rogue employee who doesn’t like you or what you’re doing, for whatever reason.

242
00:18:56,560 –> 00:19:01,200
This can be done by an employee that was paid to do this.

243
00:19:01,200 –> 00:19:02,200
I’m not joking.

244
00:19:02,200 –> 00:19:09,240
If you go to the Wikipedia page for, I believe it was Instagram, there were Instagram employees

245
00:19:09,240 –> 00:19:14,560
that were caught being paid to delete people’s Instagram accounts.

246
00:19:14,560 –> 00:19:20,800
Now for some people, that’s their business or that’s what they use to back up their photos

247
00:19:20,800 –> 00:19:22,240
or their memories.

248
00:19:22,240 –> 00:19:23,240
Just keep that in mind.

249
00:19:23,240 –> 00:19:26,360
That can be taken away from you at any time.

250
00:19:26,360 –> 00:19:33,200
Services like Fecesbook and Instagram are also liable to unexpectedly demand that you show

251
00:19:33,200 –> 00:19:35,920
them a government ID to get back into your account.

252
00:19:35,920 –> 00:19:39,800
We’ve read many cases of people complaining about that.

253
00:19:39,800 –> 00:19:47,840
I actually know somebody who uses Fecesbook as a backup tool for her photos and memories.

254
00:19:47,840 –> 00:19:52,880
So if that sounds like you, please seriously consider coming up with something that’s

255
00:19:52,880 –> 00:19:56,160
more private and secure that you actually control.

256
00:19:56,160 –> 00:20:02,240
I mean, not only can that be taken away from you at any moment for any reason, you also need

257
00:20:02,240 –> 00:20:08,680
to keep in mind that a lot of services like Fecesbook will compress and modify your photos

258
00:20:08,680 –> 00:20:12,240
and videos so that they take up less space on their servers.

259
00:20:12,240 –> 00:20:15,720
Because let’s face it, they don’t really want to store your photos.

260
00:20:15,720 –> 00:20:20,280
They just want all of the data that comes along with them, like where was it taken?

261
00:20:20,280 –> 00:20:22,440
What kind of device was it taken on?

262
00:20:22,440 –> 00:20:23,440
Who’s in it?

263
00:20:23,440 –> 00:20:24,440
What’s going on?

264
00:20:24,440 –> 00:20:25,440
Things like that.

265
00:20:25,440 –> 00:20:27,520
They don’t want your photos and videos.
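
One approach we can sketch for a backup you actually control: encrypt files locally before they ever touch someone else’s server, so whoever stores them learns nothing. This toy example uses the third-party Python cryptography library (Fernet symmetric encryption); the paths are made up, and real backup tooling needs careful key storage, which this glosses over.

    from pathlib import Path
    from cryptography.fernet import Fernet

    # Generate once and keep it somewhere safe (password manager, printed copy).
    # Lose the key and you lose the backup.
    key = Fernet.generate_key()
    fernet = Fernet(key)

    src = Path("photos/vacation.jpg")      # hypothetical file to back up
    dst = Path("backup/vacation.jpg.enc")
    dst.parent.mkdir(parents=True, exist_ok=True)

    # Encrypt locally; the .enc file can then be copied to any cloud.
    dst.write_bytes(fernet.encrypt(src.read_bytes()))

    # Restoring later is the reverse operation.
    original = fernet.decrypt(dst.read_bytes())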

266
00:20:27,520 –> 00:20:34,520
We’ve also been made aware of a little scam going around, particularly with Fecesbook where

267
00:20:34,520 –> 00:20:42,160
attackers are hacking into people’s accounts and posting various kinds of forbidden materials

268
00:20:42,160 –> 00:20:46,120
and getting these accounts banned immediately.

269
00:20:46,120 –> 00:20:47,320
And you know, that’s permanent.

270
00:20:47,320 –> 00:20:52,480
If someone can get into your Fecesbook or Instagram account and start putting some of this material

271
00:20:52,480 –> 00:20:57,840
on there, it’s not just your account that gets banned, it’s YOU that gets banned.

272
00:20:57,840 –> 00:21:00,440
You’re not allowed to make an account ever again.

273
00:21:00,440 –> 00:21:03,600
And obviously when that happens, there goes all of your data.

274
00:21:03,600 –> 00:21:09,080
They’re not going to let you sign in and download your data, at least not as far as we’re aware.

275
00:21:09,080 –> 00:21:14,280
And we’re going to do an entire episode on this, but there’s a story, you might have heard

276
00:21:14,280 –> 00:21:15,280
about it.

277
00:21:15,280 –> 00:21:23,480
There is a man in, I believe, San Francisco, whose Google account got permanently disabled

278
00:21:23,480 –> 00:21:31,000
because he took a nude image of his son for medical reasons and sent it to his doctor

279
00:21:31,000 –> 00:21:33,720
so that his son could be diagnosed.

280
00:21:33,720 –> 00:21:40,640
And Google’s automated systems identified this, disabled his account, and forwarded this information

281
00:21:40,640 –> 00:21:42,800
to law enforcement.

282
00:21:42,800 –> 00:21:47,760
And just think about what that would mean for you: if you have a Google account, what would

283
00:21:47,760 –> 00:21:50,720
happen if your account got banned like that?

284
00:21:50,720 –> 00:21:57,880
That might mean losing all of your emails, contacts, Google Docs, and calendar information.

285
00:21:57,880 –> 00:22:01,000
And in this guy’s case, he lost his internet service.

286
00:22:01,000 –> 00:22:03,600
I mean, it was just a complete disaster.

287
00:22:03,600 –> 00:22:11,200
All right, Consideration #3: Policies can change at any time without notice.

288
00:22:11,200 –> 00:22:16,440
So any day you might sign into one of your accounts, whatever that might be, and they

289
00:22:16,440 –> 00:22:21,120
might have a little pop-up that says, “Hey, we’ve changed our terms, you need to do X,

290
00:22:21,120 –> 00:22:26,200
Y or Z or agree to A, B and C, or you can’t get into your account, you can’t use our

291
00:22:26,200 –> 00:22:27,920
service anymore.”

292
00:22:27,920 –> 00:22:34,320
Now just for fun, I have no idea why I did this, but I was just randomly searching for

293
00:22:34,320 –> 00:22:37,320
an old service called Photobucket.

294
00:22:37,320 –> 00:22:42,880
Now if you’re like me and you’re old as dirt, you’ll remember that Photobucket was like

295
00:22:42,880 –> 00:22:48,960
the go-to place to store your photos like 10 or 20 years ago or however long it’s been

296
00:22:48,960 –> 00:22:50,360
at this point.

297
00:22:50,360 –> 00:22:53,960
And I looked it up because I was wondering if it still existed.

298
00:22:53,960 –> 00:23:01,040
And apparently it does, and recently it made the news because they made a policy change.

299
00:23:01,040 –> 00:23:06,640
When people logged into Photobucket one day, there was a little pop-up that said, “Hey, you

300
00:23:06,640 –> 00:23:08,600
have to pay now.”

301
00:23:08,600 –> 00:23:15,120
I think it was some outrageous amount like $399, or pounds, or whatever it was in the

302
00:23:15,120 –> 00:23:20,360
screenshot that I saw, that you’d need to pay yearly to get into

303
00:23:20,360 –> 00:23:22,920
your account and continue using it.

304
00:23:22,920 –> 00:23:27,800
And there have been so many people on the internet, in forums and whatnot, complaining, “I’ve

305
00:23:27,800 –> 00:23:32,360
been storing my photos here for over 10 years and now it’s all gone.”

306
00:23:32,360 –> 00:23:38,320
And you’ll go to these forums like car repair forums and stuff and all of these images are

307
00:23:38,320 –> 00:23:43,320
missing all over the internet because people were storing them in Photobucket.

308
00:23:43,320 –> 00:23:49,600
And now they’re not available anymore because who wants to pay Photobucket $400 to store

309
00:23:49,600 –> 00:23:50,600
their photos?

310
00:23:50,600 –> 00:23:51,600
It’s insane.

311
00:23:51,600 –> 00:23:57,360
Now, I think what they did was make it sound like you wouldn’t even be

312
00:23:57,360 –> 00:24:00,120
able to get in to download your photos unless you paid.

313
00:24:00,120 –> 00:24:05,640
But to their credit, I think I read somewhere that there actually was some hidden way to get in

314
00:24:05,640 –> 00:24:09,040
and download your photos before they deleted your account.

315
00:24:09,040 –> 00:24:11,120
But still, that’s not really the point.

316
00:24:11,120 –> 00:24:16,800
The point of this Consideration is to just keep in mind that if you don’t physically

317
00:24:16,800 –> 00:24:23,640
control your account and your data, it can be taken away from you at any time.

318
00:24:23,640 –> 00:24:30,080
Consideration #4: If an app or service can ID you, you should assume that that is

319
00:24:30,080 –> 00:24:31,080
permanent.

320
00:24:31,080 –> 00:24:37,600
Now, even though I know this, I will sometimes go into accounts that I own and change my

321
00:24:37,600 –> 00:24:42,600
name to like Elon Musk or whomever and change things around.

322
00:24:42,600 –> 00:24:48,320
If anything, so that if there’s a data breach, you know, my real information might not get

323
00:24:48,320 –> 00:24:54,800
exposed, but I don’t have any delusions that by doing that, I’m actually hiding from this

324
00:24:54,800 –> 00:24:57,520
company who I really am.

325
00:24:57,520 –> 00:25:01,560
So just keep this in mind anytime you use a new app or service or you’re going to open

326
00:25:01,560 –> 00:25:07,480
an account somewhere, if they can get your name, IP address, phone number, device and

327
00:25:07,480 –> 00:25:15,240
browser fingerprints, IMEI, IMSI, MAC address, whatever, just assume that they are never

328
00:25:15,240 –> 00:25:19,320
going to give that up, no matter if you try to change it or if you delete your account

329
00:25:19,320 –> 00:25:22,880
or whatever, just assume that that is permanent.

330
00:25:22,880 –> 00:25:28,880
And along with what we were saying earlier about avoiding a data silo mentality, just

331
00:25:28,880 –> 00:25:34,720
be aware that when you open an account somewhere, if they can ID you, chances are they’re going

332
00:25:34,720 –> 00:25:39,160
to start linking your other accounts together and sharing data.
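
To make the linking mechanism concrete, here is a toy Python illustration of how trivially two datasets can be joined once they share a single normalized identifier, in this case a phone number. The records are invented; real data brokers do this at enormous scale with many more signals (emails, device IDs, fingerprints).

    import re

    def normalize_phone(raw: str) -> str:
        # Collapse "+1 (555) 010-4477", "555.010.4477", etc. into one key.
        digits = re.sub(r"\D", "", raw)
        return digits[-10:]  # naive US-style normalization, illustration only

    # Hypothetical user records from two unrelated services.
    service_a = [{"user": "jane_doe", "phone": "+1 (555) 010-4477"}]
    service_b = [{"user": "jd1985", "phone": "555.010.4477"}]

    index = {}
    for source, records in (("Service A", service_a), ("Service B", service_b)):
        for rec in records:
            key = normalize_phone(rec["phone"])
            index.setdefault(key, []).append((source, rec["user"]))

    # Any key seen in more than one service is the same person, linked.
    for key, accounts in index.items():
        if len({src for src, _ in accounts}) > 1:
            print(key, "links", accounts)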

333
00:25:39,160 –> 00:25:44,680
So like what I was saying before, we found out that Trello shares data with Facebook

334
00:25:44,680 –> 00:25:49,280
or at least they used to, I don’t know if they’re still doing that. And I have a sneaking

335
00:25:49,280 –> 00:25:52,320
suspicion that I know how they’re doing that.

336
00:25:52,320 –> 00:25:58,840
So I remember years ago, Trello came out and said, “Hey, we’ve got two-factor authentication (2FA)

337
00:25:58,840 –> 00:25:59,840
now.”

338
00:25:59,840 –> 00:26:03,360
And I thought to myself, “Oh, that’s awesome, I’m going to turn that on.

339
00:26:03,360 –> 00:26:04,960
What are my options?”

340
00:26:04,960 –> 00:26:05,960
There’s one option.

341
00:26:05,960 –> 00:26:07,640
And guess what it is?

342
00:26:07,640 –> 00:26:08,640
SMS.

343
00:26:08,640 –> 00:26:15,120
Now, if you know anything about privacy, you’ll know that a phone number might seem kind

344
00:26:15,120 –> 00:26:19,840
of harmless, but your phone number is basically your Social Security Number (SSN)

345
00:26:19,840 –> 00:26:21,320
for the internet.

346
00:26:21,320 –> 00:26:27,440
I’ve actually heard multiple times that your phone number is more valuable on

347
00:26:27,440 –> 00:26:32,800
the dark web than your social security number is because it’s so powerful and it links your

348
00:26:32,800 –> 00:26:36,760
identity to so many of the accounts that you use.

349
00:26:36,760 –> 00:26:41,480
And I have a sneaking suspicion that what Trello was doing was taking people’s phone

350
00:26:41,480 –> 00:26:46,800
numbers that they were using for SMS two-factor authentication (2FA) and using that to link to their

351
00:26:46,800 –> 00:26:48,800
Facebook accounts.

352
00:26:48,800 –> 00:26:54,440
And I’m also pretty sure that Twitter recently got sued in a class action lawsuit for doing

353
00:26:54,440 –> 00:26:56,000
the same thing.

354
00:26:56,000 –> 00:27:00,560
Because let’s face it, when someone’s trying to secure their account with two-factor authentication,

355
00:27:00,560 –> 00:27:04,400
they’re probably not thinking to themselves, “Oh, well, this information is going to be

356
00:27:04,400 –> 00:27:09,080
used to identify me and start linking all my accounts together and sharing my private

357
00:27:09,080 –> 00:27:11,840
sensitive information with third parties.”

358
00:27:11,840 –> 00:27:16,320
But you know, unfortunately, that’s what’s going on with a lot of services.
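
For what it’s worth, when a service offers an authenticator-app option instead of SMS, you get the second factor without handing over a phone number at all. As a rough sketch of the TOTP math those apps run, using the third-party pyotp library (the secret here is generated on the spot; a real service hands you its own, usually as a QR code):

    import pyotp

    # The service stores this shared secret; you load it into your app.
    secret = pyotp.random_base32()
    totp = pyotp.TOTP(secret)

    code = totp.now()  # 6-digit code that rotates every 30 seconds
    print("current code:", code)

    # The server performs the same computation and compares.
    assert totp.verify(code)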

359
00:27:16,320 –> 00:27:23,360
So again, just to be very clear, if you’re going to give out your identifying information,

360
00:27:23,360 –> 00:27:28,760
like a phone number, for example, just assume that they’re going to use that and abuse that

361
00:27:28,760 –> 00:27:31,120
and you’re never going to be able to take that back.

362
00:27:31,120 –> 00:27:34,640
Even if you change your phone number to something else, they’re always going to remember what

363
00:27:34,640 –> 00:27:36,360
your original one was.

364
00:27:36,360 –> 00:27:40,320
They’re always going to remember who you are specifically.

365
00:27:40,320 –> 00:27:43,440
There’s no taking that back.

366
00:27:43,440 –> 00:27:45,520
Consideration #5:

367
00:27:45,520 –> 00:27:51,960
Many services make it difficult or impossible to change certain information.

368
00:27:51,960 –> 00:27:58,440
We see this being a problem mostly with usernames, names, and birthdays.

369
00:27:58,440 –> 00:28:03,040
Now from the company’s perspective, I can kind of understand them not wanting to change

370
00:28:03,040 –> 00:28:07,600
your birthday because I mean, as far as they’re concerned, you should have given them your

371
00:28:07,600 –> 00:28:10,720
real birthday and obviously that doesn’t change.

372
00:28:10,720 –> 00:28:15,920
But you know, from our perspective, the user, we might want to change that because that

373
00:28:15,920 –> 00:28:19,640
information is probably going to end up in a data breach at some point.

374
00:28:19,640 –> 00:28:23,920
But the whole username thing really grinds my gears because, you know, if you’re not

375
00:28:23,920 –> 00:28:28,840
really thinking about it or you’re just getting lazy and you’re using the same username for

376
00:28:28,840 –> 00:28:33,800
a bunch of different systems, just be aware that they might be communicating with each

377
00:28:33,800 –> 00:28:37,400
other so they can link your identity and your accounts together.

378
00:28:37,400 –> 00:28:43,000
But also you need to understand that there are many websites out there where you can

379
00:28:43,000 –> 00:28:48,360
just type in somebody’s username and it’ll pull up a bunch of information about them.

380
00:28:48,360 –> 00:28:53,520
Some of them are just as simple as showing what other systems out there like Fecesbook

381
00:28:53,520 –> 00:29:00,160
and Twitter and Medium, whatever the heck that is, have that username registered.

382
00:29:00,160 –> 00:29:04,560
But you know, if somebody is trying to attack you for one reason or another, like if they’re

383
00:29:04,560 –> 00:29:09,000
just trying to mess with you or they’re trying to steal your identity or get into your accounts,

384
00:29:09,000 –> 00:29:11,400
a tool like this is very useful.
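
Those lookup tools aren’t magic, either. At the simplest level they just probe each service’s public profile URL pattern and note which ones respond, something like this minimal sketch using the requests library. The URL templates and the status-code heuristic are illustrative only; real checkers handle many site-specific quirks.

    import requests

    # Simplified profile URL patterns; sites change these over time.
    SITES = {
        "Twitter": "https://twitter.com/{}",
        "GitHub": "https://github.com/{}",
        "Medium": "https://medium.com/@{}",
    }

    def check(username: str) -> None:
        for site, template in SITES.items():
            r = requests.get(template.format(username), timeout=10)
            # Crude heuristic: 200 usually means the profile page exists.
            print(site, "taken" if r.status_code == 200 else "possibly free")

    check("jane_doe")  # hypothetical username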

385
00:29:11,400 –> 00:29:15,040
So at the end of this episode, one of the things that you’re going to want to consider

386
00:29:15,040 –> 00:29:20,560
doing is going to these services where you have the same username all over the place

387
00:29:20,560 –> 00:29:26,080
and thinking about changing some of them to something random so that at least average people can’t

388
00:29:26,080 –> 00:29:29,120
just link all of your accounts together.

389
00:29:29,120 –> 00:29:33,440
And email and shipping address information can also be problematic.

390
00:29:33,440 –> 00:29:36,480
This isn’t just a username or a birthday problem.

391
00:29:36,480 –> 00:29:44,000
A lot of services, especially shopping websites, won’t let you delete your

392
00:29:44,000 –> 00:29:49,320
email address or your shipping address, or sometimes a credit card or something.

393
00:29:49,320 –> 00:29:54,960
For the email, what I typically do is give them an email alias from SimpleLogin

394
00:29:54,960 –> 00:29:59,800
or AnonAddy or something like that, and for the shipping address, I’ll just completely make

395
00:29:59,800 –> 00:30:00,800
something up.
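
SimpleLogin and AnonAddy have their own apps and APIs for this, but the underlying idea is just a unique, revocable address per service that forwards to your real inbox. A toy sketch of that naming scheme (the domain is hypothetical, and this is not either provider’s actual API):

    import secrets

    def make_alias(service: str, domain: str = "alias.example") -> str:
        # One disposable address per service; revoke it if it starts
        # receiving spam, and you also learn exactly who leaked it.
        tag = secrets.token_hex(4)  # random suffix so aliases can't be guessed
        return f"{service.lower()}.{tag}@{domain}"

    print(make_alias("ebay"))     # e.g. ebay.9f3ac1d2@alias.example
    print(make_alias("spotify"))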

396
00:30:00,800 –> 00:30:01,800
All right.

397
00:30:01,800 –> 00:30:06,720
So let me give you some examples of this from our experience helping our clients.

398
00:30:06,720 –> 00:30:11,280
American Express and Spotify do not let you change your username.

399
00:30:11,280 –> 00:30:16,480
Sears, if they even still exist, last time I checked, they don’t let you change your

400
00:30:16,480 –> 00:30:19,680
name without contacting customer service.

401
00:30:19,680 –> 00:30:24,640
Fecesbook does this thing where if you close your account and then ever want

402
00:30:24,640 –> 00:30:30,600
to reopen it again for whatever reason, you need to show them a photo ID. And they also

403
00:30:30,600 –> 00:30:33,000
have rules for changing your birthday.

404
00:30:33,000 –> 00:30:38,000
I’m pretty sure they can ask you to show them proof that that’s actually your

405
00:30:38,000 –> 00:30:43,080
birthday, or they have some rule where if you change it more than two or

406
00:30:43,080 –> 00:30:47,480
three times or something, they take some action against you. It’s annoying, but just

407
00:30:47,480 –> 00:30:53,480
keep this in mind anytime you open an account, just be aware that it might be difficult or

408
00:30:53,480 –> 00:30:57,240
impossible to change some of the information that you give them.

409
00:30:57,240 –> 00:31:03,320
This problem is so bad that a lot of these companies will fight you tooth and nail to

410
00:31:03,320 –> 00:31:06,520
not let you change or delete some of your data.

411
00:31:06,520 –> 00:31:13,160
I literally went back and forth with a company for weeks, just for the sake of, I don’t

412
00:31:13,160 –> 00:31:18,280
know, research, I guess, trying to get them to delete my account, and they just wouldn’t do

413
00:31:18,280 –> 00:31:19,280
it.

414
00:31:19,280 –> 00:31:23,080
I’m just thinking to myself like, wouldn’t it be cheaper and easier for them to just

415
00:31:23,080 –> 00:31:28,080
delete it rather than argue with me back and forth for weeks?

416
00:31:28,080 –> 00:31:35,760
And we’ve also learned that even a CCPA request might not delete a lot of your information.

417
00:31:35,760 –> 00:31:42,680
I’ve seen a response from a CCPA request where they’ll say things like, “Okay, we fulfilled

418
00:31:42,680 –> 00:31:49,040
your request, but just be aware that we’re required to keep like your financial information

419
00:31:49,040 –> 00:31:50,520
and your purchase history…”

420
00:31:50,520 –> 00:31:57,200
They listed like 12 things and I was thinking to myself, “Well, what did you actually delete?”

421
00:31:57,200 –> 00:31:59,200
Like this is ridiculous.

422
00:31:59,200 –> 00:32:06,440
All right, the sixth and final Consideration for Part 1 of this episode is that you also

423
00:32:06,440 –> 00:32:10,880
need to be careful about what information you provide because even if they allow you

424
00:32:10,880 –> 00:32:17,400
to change it, some services might take very adverse actions against you for changing your

425
00:32:17,400 –> 00:32:18,960
information.

426
00:32:18,960 –> 00:32:26,640
So what’s going on there is a lot of services are really paranoid about security incidents.

427
00:32:26,640 –> 00:32:30,400
They don’t understand security and they don’t really know what to do with it.

428
00:32:30,400 –> 00:32:37,280
So they have these automated systems set in place that just look for anything that looks

429
00:32:37,280 –> 00:32:41,580
remotely unusual and just smash it with a hammer.

430
00:32:41,580 –> 00:32:48,280
So I went through this exercise a couple of years ago where I decided to do an accounting

431
00:32:48,280 –> 00:32:53,960
of what accounts I have and identify which ones I thought were kind of weak and that I should

432
00:32:53,960 –> 00:32:59,800
beef up a little bit. I went through all of them and, you know, changed my email

433
00:32:59,800 –> 00:33:05,960
address to an alias and beefed up the password to something that’s like 30 random characters

434
00:33:05,960 –> 00:33:07,680
and stuff like that.
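
If you’re wondering how to get “30 random characters” in practice, any password manager will generate them for you; Python’s standard secrets module, which is designed for exactly this kind of use, shows the idea in a few lines:

    import secrets
    import string

    ALPHABET = string.ascii_letters + string.digits + string.punctuation

    def random_password(length: int = 30) -> str:
        # secrets draws from the OS CSPRNG, unlike the random module.
        return "".join(secrets.choice(ALPHABET) for _ in range(length))

    print(random_password())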

435
00:33:07,680 –> 00:33:14,560
And some services really put users in a box because they see that information being changed

436
00:33:14,560 –> 00:33:18,720
as being really suspicious for some reason because they think it might be an account

437
00:33:18,720 –> 00:33:20,440
takeover or something.

438
00:33:20,440 –> 00:33:22,480
I’ll give you a perfect example.

439
00:33:22,480 –> 00:33:26,280
So I went into an old eBay account that I have.

440
00:33:26,280 –> 00:33:30,640
I don’t really use it, but I just thought to myself, “Well, you know, I want it to have

441
00:33:30,640 –> 00:33:31,640
good security.

442
00:33:31,640 –> 00:33:35,800
I obviously don’t want anyone else to try to get into it.”

443
00:33:35,800 –> 00:33:41,240
So I went in there, you know, from a new session, obviously, I don’t really keep sessions because

444
00:33:41,240 –> 00:33:44,440
that’s a security risk and a privacy risk.

445
00:33:44,440 –> 00:33:48,880
And I was using a VPN because I don’t want eBay to know what my real IP address is.

446
00:33:48,880 –> 00:33:50,440
Why would I want them to have that?

447
00:33:50,440 –> 00:33:53,760
And I don’t want my ISP knowing what I’m doing on the internet.

448
00:33:53,760 –> 00:33:57,040
That’s a very normal and reasonable thing to do.

449
00:33:57,040 –> 00:34:02,960
Well, anyway, I went in there, signed in, changed my email address and my password to

450
00:34:02,960 –> 00:34:05,240
secure my account.

451
00:34:05,240 –> 00:34:10,640
And sometime later, I think it was like, I don’t know, like an hour or two, they sent

452
00:34:10,640 –> 00:34:15,040
me an email to the new email address, which is kind of weird.

453
00:34:15,040 –> 00:34:19,760
And they said, like, Hey, we’ve detected, you know, I’m just paraphrasing here.

454
00:34:19,760 –> 00:34:23,560
But they’re like, you know, we’ve detected some suspicious activity.

455
00:34:23,560 –> 00:34:25,320
So we went ahead and disabled your account.

456
00:34:25,320 –> 00:34:29,880
And if you want to get back into it, you need to contact customer support.

457
00:34:29,880 –> 00:34:34,760
So right off the bat, I’m extremely irritated because, you know, I just spent my time going

458
00:34:34,760 –> 00:34:37,120
in there to secure my account.

459
00:34:37,120 –> 00:34:40,560
Now they’re telling me that they locked my account to make sure that it’s secure.

460
00:34:40,560 –> 00:34:45,040
And they’re asking me to help them secure it, which is what I just did.

461
00:34:45,040 –> 00:34:47,240
So I’m extremely irritated about it.

462
00:34:47,240 –> 00:34:52,560
And on top of that, their customer support is practically non-existent.

463
00:34:52,560 –> 00:34:58,280
So I did some digging around to try to find a phone number, and then I called it and there’s

464
00:34:58,280 –> 00:35:03,440
like this automated system where it’s like, Oh, you need a PIN number. Like you can’t

465
00:35:03,440 –> 00:35:04,440
just talk to us.

466
00:35:04,440 –> 00:35:05,800
You need to give us a PIN number first.

467
00:35:05,800 –> 00:35:07,360
And I’m like, “What PIN number?

468
00:35:07,360 –> 00:35:09,320
No one ever gave me a PIN number.

469
00:35:09,320 –> 00:35:10,320
How do I get a PIN?”

470
00:35:10,320 –> 00:35:11,920
Like this is insane.

471
00:35:11,920 –> 00:35:18,320
So then I found some like support chat where they ask you a bunch of questions about who

472
00:35:18,320 –> 00:35:22,520
you are and like what your name and address is and stuff like that.

473
00:35:22,520 –> 00:35:27,280
And the first time this happened to me, I was actually able to get them to let me back

474
00:35:27,280 –> 00:35:31,200
into my account, but I had a problem.

475
00:35:31,200 –> 00:35:37,800
So the first time that I changed my email address, I changed it to a bridge email address

476
00:35:37,800 –> 00:35:44,000
so that I could prevent my original email provider from seeing what I changed it to.

477
00:35:44,000 –> 00:35:45,000
That’s a little bit complicated.

478
00:35:45,000 –> 00:35:49,160
We’ll go into detail about that in a future episode, but basically when you change an

479
00:35:49,160 –> 00:35:54,720
email address, a service will often email your new one and your old one.

480
00:35:54,720 –> 00:35:59,160
So if you’re trying to, you know, run away from your old email provider, you don’t want

481
00:35:59,160 –> 00:36:01,800
them to see what your new email address is.

482
00:36:01,800 –> 00:36:06,320
So I used a bridge email address with eBay, and I was going to go back in and change it

483
00:36:06,320 –> 00:36:08,840
to my final one later.

484
00:36:08,840 –> 00:36:13,320
So when I did that the second time, they locked my account again!

485
00:36:13,320 –> 00:36:18,720
And this time I got ahold of their support on the chat system and I answered the same

486
00:36:18,720 –> 00:36:22,680
questions and they just said, sorry, we can’t help you.

487
00:36:22,680 –> 00:36:25,520
So my account has just been locked for years.

488
00:36:25,520 –> 00:36:27,680
I can’t get into my eBay account.

489
00:36:27,680 –> 00:36:32,400
I’ve spent hours trying to figure out what to do about it and who to contact.

490
00:36:32,400 –> 00:36:34,760
And it’s just insane.

491
00:36:34,760 –> 00:36:39,520
And this time I actually talked to the person a little bit longer, like, “Why do you guys

492
00:36:39,520 –> 00:36:40,520
do this?

493
00:36:40,520 –> 00:36:41,520
What’s going on?

494
00:36:41,520 –> 00:36:44,120
Like how can somebody prevent this from happening?”

495
00:36:44,120 –> 00:36:49,360
And you know, naturally they couldn’t tell me why they did this or what they’re looking

496
00:36:49,360 –> 00:36:50,440
for.

497
00:36:50,440 –> 00:36:57,200
But the person basically said that I should keep a session open all the time, you know,

498
00:36:57,200 –> 00:37:02,000
not clear out my session (which I do for privacy and security reasons), and to

499
00:37:02,000 –> 00:37:03,680
not use a VPN.

500
00:37:03,680 –> 00:37:09,440
So I basically told them, I said, “You’re trying to force me to implement poor privacy and

501
00:37:09,440 –> 00:37:13,880
security practices to help keep my eBay account secure.

502
00:37:13,880 –> 00:37:16,720
Like that doesn’t, that doesn’t make any sense.”

503
00:37:16,720 –> 00:37:21,660
So not surprisingly, we don’t use eBay anymore and we don’t recommend that anybody else do

504
00:37:21,660 –> 00:37:25,780
it either because this is not the way to treat your customers.

505
00:37:25,780 –> 00:37:30,440
But anyway, the point of this is that you need to be very careful about what accounts

506
00:37:30,440 –> 00:37:35,600
you open and what information you give them from the very beginning.

507
00:37:35,600 –> 00:37:40,360
If you’re going to give them a bunch of information because you’re lazy or you’re low on time

508
00:37:40,360 –> 00:37:44,440
or whatever, and you’re thinking to yourself, “Oh, well, I’ll just go back in at a later date

509
00:37:44,440 –> 00:37:45,440
and change it.”

510
00:37:45,440 –> 00:37:48,160
Well, we would say, “Think twice about that.”

511
00:37:48,160 –> 00:37:52,680
I mean, for one, like we said earlier, they might just not let you do that.

512
00:37:52,680 –> 00:37:59,160
And in this type of scenario, they might lock or freeze or disable or delete your account

513
00:37:59,160 –> 00:38:04,160
or whatever because they think it looks suspicious when you’re changing your information.

514
00:38:04,160 –> 00:38:06,840
It’s insane, but that’s just the way that it is.

515
00:38:06,840 –> 00:38:07,840
All right.

516
00:38:07,840 –> 00:38:10,680
So now let’s go over some action items.

517
00:38:10,680 –> 00:38:15,320
In addition to not using eBay, just keep in mind that we’re going to go over these in

518
00:38:15,320 –> 00:38:20,000
a little bit more detail in Part 2 when we summarize all this information.

519
00:38:20,000 –> 00:38:28,120
But for now, be very stingy about opening new accounts. And be very selective about what

520
00:38:28,120 –> 00:38:30,720
information you give out.

521
00:38:30,720 –> 00:38:35,840
If you do that, you’ll improve your privacy and you’ll reduce the amount of damage that can

522
00:38:35,840 –> 00:38:40,240
be done when a lot of these services inevitably get breached.

523
00:38:40,240 –> 00:38:45,800
And you’ll spare yourself a lot of headaches that we and our clients have experienced over

524
00:38:45,800 –> 00:38:49,880
the years dealing with hundreds of apps and services.

525
00:38:49,880 –> 00:38:54,960
If you’d like more help with this and other privacy and security concerns, consider becoming

526
00:38:54,960 –> 00:38:56,640
a Bigger Insights client.

527
00:38:56,640 –> 00:39:01,800
We help people like you live more private and secure lives in one-on-one sessions.

528
00:39:01,800 –> 00:39:06,720
If you’re interested in that, go to our website, biggerinsights.com, and fill out the short

529
00:39:06,720 –> 00:39:10,800
form at the bottom of the page so we can schedule your initial consultation.

530
00:39:10,800 –> 00:39:16,520
Otherwise, please consider sharing this podcast so that we can help others navigate this Black

531
00:39:16,520 –> 00:39:18,800
Mirror world that we’re living in.

532
00:39:18,800 –> 00:39:22,400
Also be sure to subscribe and stay tuned for Part 2.

533
00:39:22,400 –> 00:39:27,680
And after that, we’re going to be publishing an episode along these lines called “Finding,

534
00:39:27,680 –> 00:39:30,120
Sanitizing, and Closing Accounts.”

535
00:39:30,120 –> 00:39:32,040
So keep an eye out for that as well.

536
00:39:32,040 –> 00:39:34,920
All right, that’s everything for this episode.

537
00:39:34,920 –> 00:39:58,000
Stay safe out there and have a great rest of your day.

