What happens when G+ shuts down? Will our community content be lost?

(Claudio Prezzi) #1

I am working hard on saving all our community content, so all the valuable know-how will NOT be lost!

First I made sure we have the content saved somewhere (read only); later we can move to an alternative platform or extend the backup into a full community.

Here is the current dump (synced every 15 minutes): https://fablab-winti.firebaseapp.com/

There are still some problems I am working on: some comments are missing, and all the pictures still come from http://googleusercontent.com (they need to be grabbed).


(Eric Lien) #2

The formatting and speed look really nice. Do you have details on how you did the backup? We did something similar for the Eustathios and HercuLien 3D printing group, but your layout maintains a look and feel similar to G+ that I like.


(Dries Verbruggen) #3

I would love to know that too; it looks great.


(Claudio Prezzi) #4

I did not develop the code myself. I just used a tutorial by @Romain_Vialard (see https://docs.google.com/document/d/1UGhxaN5AiRXXL0Ki0DlVWLYJo_YYiEYhM2w1caRhljU/edit?usp=sharing).

His solution works in two stages:

  1. A Google Apps Script that runs frequently (scheduled in Firebase). It syncs the community content grabbed from G+ into a Firebase Realtime Database.
  2. A Glitch app that builds the frontend, which is pushed to Firebase Hosting for public access.
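
To make the first stage concrete, here is a minimal sketch of what the sync step does, assuming the shape of a Google+ Activities API item: it flattens a post into a plain record suitable for a Firebase Realtime Database node. The function and field names are illustrative assumptions, not the actual schema from Romain's tutorial.

```javascript
// Hypothetical sketch: flatten a Google+ Activities API item into a plain
// record for the Firebase Realtime Database. Field names are assumptions.
function toDbRecord(activity) {
  return {
    id: activity.id,
    author: activity.actor && activity.actor.displayName,
    published: activity.published,
    text: activity.object && activity.object.content,
    // Image attachments keep their googleusercontent.com URLs for now;
    // those links die at shutdown and must be re-hosted separately.
    images: ((activity.object && activity.object.attachments) || [])
      .filter(function (a) { return a.objectType === 'photo'; })
      .map(function (a) { return a.fullImage && a.fullImage.url; })
  };
}
```

In the real setup this record would then be written to the database (e.g. via the Realtime Database REST API) by the scheduled script.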

(Dries Verbruggen) #5

Interesting, @cprezzi. I just followed that tutorial and have a working copy. But how much of that is actually extracted, and how much is still using Google+ parts (APIs etc.)? Is this fully autonomous?


(Claudio Prezzi) #6

@Unfold All post and comment data (text, date, owner…) is copied to the Firebase DB. Pictures and videos are only links to http://googleusercontent.com, which will be gone after the shutdown. The frontend does not depend on any G+ resources; the HTML, JS and CSS come from local resources.

I am working on a tool that downloads all the pictures to a public server and then updates the links in the DB.


(Dries Verbruggen) #7

@cprezzi thanks, was really easy but no idea what I actually did :slight_smile:


(Anthony Bolgar) #8

@cprezzi Would you like a new home for the group? We have created a new Makers Forum to host different communities that are looking for a new home. Stephane Buisson and I have already started hosting the K40 community there, as well as the OX group R7 sub group, and it is also the official awesome.tech site for support of the Gerbil controllers. We would be happy to have you guys along as well.

The forum is at http://forum.makerforums.info


(Claudio Prezzi) #9

@funinthefalls Thank you for the info. I will check it out when I have some time.
I also thought about adding a forum to the laserweb website, but a forum is much less appealing than the G+ community design (at least for the existing content).


(Anthony Bolgar) #10

We are using a hosted Discourse platform. It is a more modern way of doing a forum.


(Dries Verbruggen) #11

@cprezzi The way we’re implementing this with @Wikifactory_Communit is hybrid: forum style like Discourse and stream style like G+, and you can switch between them. But Wikifactory is more for hardware, so I’m not sure this is ideal for you, although LaserWeb is an essential part of hardware projects. I run it with my FabCreator :slight_smile:


(Jorge Robles) #12

@cprezzi Maybe you can convert the pics to data-URI blobs and store them in Firebase too. Not very orthodox, but it avoids needing any other machine.
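
For readers unfamiliar with the idea, Jorge's suggestion boils down to something like this sketch (plain Node.js, illustrative only): encode the raw image bytes as base64 and prefix the MIME type, so the resulting string can be stored directly in the database.

```javascript
// Sketch of the data-URI approach: inline an image as a base64 string so it
// can live in the Firebase DB next to the post. Note the trade-offs: base64
// adds roughly 33% overhead, and Realtime Database string values are capped
// (around 10 MB), so Cloud Storage is usually the cleaner option.
function toDataUri(bytes, mimeType) {
  return 'data:' + mimeType + ';base64,' + Buffer.from(bytes).toString('base64');
}
```

The resulting string can be used directly as an `<img src="…">` value in the frontend.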


(Romain Vialard) #13

@Jorge_Robles @cprezzi actually Firebase comes with a storage solution:

It’s free for up to 5 GB of data, which should be enough to store most pictures linked to the posts of a community.
I was waiting to see if photos would be deleted or not by Google, but they confirmed that they will be deleted, so it is better to export them along with the other post data.


(Claudio Prezzi) #14

@Romain_Vialard I know. I am already working on a script that downloads all the pictures found in the Firebase DB URLs (from your sync), uploads them to Firebase Cloud Storage, and then inserts the new link into a new-url field. When all the pictures are ready, we can modify the frontend to use the new-url.
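
The re-link step Claudio describes could look roughly like this sketch. The bucket name, path scheme, and `newUrls` field name are assumptions for illustration; the actual script's naming may differ. The key design point matches the post: the original URL is kept and the new link goes into a separate field, so the frontend can be switched over only once all pictures are re-hosted.

```javascript
// Hypothetical sketch of the re-link step: for each image URL in a post
// record, compute the matching Cloud Storage public URL and store it in a
// separate "newUrls" field, leaving the original googleusercontent.com
// links untouched until the migration is complete.
function addNewUrls(postId, record, bucket) {
  var images = record.images || [];
  return Object.assign({}, record, {
    newUrls: images.map(function (url, i) {
      // One storage object per image, keyed by post id and index.
      return 'https://storage.googleapis.com/' + bucket +
             '/images/' + postId + '-' + i + '.jpg';
    })
  });
}
```

A real implementation would first fetch each `images` URL and upload the bytes to the bucket before writing the record back to the database.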

I would be happy to share my stuff with you to make that migration as easy as possible.


(Romain Vialard) #15

@cprezzi great, thanks!


(Stephane BUISSON) #16

@cprezzi I am sure you are aware of this :

from G+ to owner (wait for early March):

"Google+ Communities
To download data for Communities where you’re an owner or moderator, select Google+ Communities. You will get:

Names and links to Google+ profiles of community owners, moderators, members, applicants, banned members, and invitees
Links to posts shared with the community
Community metadata, including community picture, community settings, content control settings, your role, and community categories
Important: Starting early March 2019, you will also be able to download additional details from public communities, including author, body, and photos for every community post."

I also sent you a private post.


(Claudio Prezzi) #17

@StephaneBUISSON Yes, I’m aware, but I don’t trust Google’s announcements anymore and I like to be safe :wink:


(Dries Verbruggen) #18

@cprezzi Same here, they can’t even set a specific day for the extra Takeout features. ‘Early’ March is just way too late.


(Michael K Johnson) #19

@cprezzi I’ll plan to archive it so that if you decide you’d like the content imported into http://forum.makerforums.info you’ll be able to do that. No pressure, just information in case you look at what’s happening there and would like to exist in a larger community with related content. I have the import process working very well now. :slight_smile:


(Claudio Prezzi) #20

@mcdanlj Yes, I think it makes sense to also migrate the LaserWeb community to makerforums.info, and I hope this will become the new home for most makers.