Hello, I recently moved my Shapeoko into a public makerspace. The place does not have internet access. I took a look at the Synthos offline directions. Has anyone authored a Python (or other language) script to download all the scripts and remap them?

No, this has not been done yet. @jlauer and I have talked about this, but time / cycles / innovation have been in short supply for now. It would be a major plus for setups like yours if someone were to tackle this.

This has been discussed a lot and the general conclusion is that it’s not worth the effort. Everything is moving to the cloud and CNC must as well. The benefits are just too great. Case in point: there’s work being done on a cloud slicer for 3D printing for ChiliPeppr. If you aren’t on the Internet, it wouldn’t work. There’s chatter about a cloud OpenCV machine vision service as well. There are too many amazing things to add to ChiliPeppr, and without being in the cloud, much of it is not possible. Of course, the community can do anything they want with ChiliPeppr. So, I encourage you to work on a script if you want. It may not even be that hard to pull off.

Just a thought … since you only need to connect long enough to get ChiliPeppr running, could you use your phone as a mobile hotspot?

@jlauer You are right, the evolving version of ChiliPeppr will be in the cloud. But we also want to use the current version offline. Can you please help with this?

Well, the part I’m working on right now is the Eagle BRD file import feature. So, I’d rather stay focused on that as it’s super hard. I think doing an offline version could be tackled by somebody new, since it’s a fairly straightforward task given it’s Javascript/HTML/CSS, and there are probably tools out there folks have written for crawling websites that could be used. I’ll leave this one up to the community.

I have tried SiteSucker and HTTrack to crawl the site, but they did not go beyond the main page. :frowning:

You need a crawler that will grab all the javascript dependencies and make them local.
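To sketch what that rewriting step might look like (this is a hypothetical helper, not part of any existing crawler): given a page’s HTML, you’d collect the external script URLs and rewrite each tag to point at a local copy, then download those URLs separately.

```javascript
// Hypothetical helper (not part of any existing tool): collect external
// <script src> URLs from an HTML string and rewrite each tag to point at
// a local copy under js/, so downloaded files can be referenced offline.
function localizeScripts(html) {
    var urls = [];
    var out = html.replace(/<script([^>]*)\ssrc="([^"]+)"/g, function(match, attrs, url) {
        urls.push(url);
        // derive a local filename from the last path segment, dropping any query string
        var local = "js/" + url.split("/").pop().split("?")[0];
        return "<script" + attrs + ' src="' + local + '"';
    });
    // the entries in `urls` still need to be downloaded separately
    return { html: out, urls: urls };
}
```

A real crawler would also have to handle CSS, images, and relative paths, but this is the core idea of “making dependencies local.”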

Can you suggest software for that? Or how can I Google it?

Honestly, I just looked deeper. It’s not worth it. Too many of the add-on modules load dynamically, i.e. the new Eagle BRD import is so massive that ChiliPeppr doesn’t load it until you click the “Eagle” button. That means no crawler would have insight into which files are needed. It’s just not going to work. It would be easier for folks to just get Internet connectivity. I realize both are hard, but it’s just about impossible to make an offline version of ChiliPeppr. Sorry.
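To illustrate why a crawler is blind to these modules, here is a sketch of the general click-to-load pattern (this is not ChiliPeppr’s actual loader; the button id and module path are made up). The script tag is only created at click time, so a static crawl of the page’s HTML never sees the module URL.

```javascript
// Illustrative sketch of dynamic module loading (hypothetical names,
// not ChiliPeppr's real loader): the <script> element only comes into
// existence when the user clicks, so it is invisible to a static crawl.
function loadModuleOnClick(buttonId, moduleUrl) {
    document.getElementById(buttonId).addEventListener("click", function() {
        var s = document.createElement("script");
        s.src = moduleUrl;
        document.head.appendChild(s);
    });
}
```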

You can easily zip the hosting directory and share it with us (if ChiliPeppr is open source); we can run it on MAMP locally.

It’s all on JSFiddle. There’s no hosting directory. It’s all in the cloud. ChiliPeppr is open source, so you can absolutely pull the source and try to do an offline version. ChiliPeppr pulls Javascript as a mashup from all over the Internet. Each widget loads other widgets which load other widgets.

BTW, here’s an attempt at creating an offline version of ChiliPeppr from a macro. Just run it inside the Macro feature in CP and it’ll spit content out to console.log(). Remember you have to turn on “Debug Console Output” in the menu to the right of the login in the upper-right corner to get the console to work inside ChiliPeppr. This script tries to retrieve each Javascript src file and inline it to create one massive monolithic HTML page. Then, in theory, you could save the final HTML to a local file and it would run with no external dependencies. The problem is as I stated earlier: if modules are dynamic, this approach won’t work out of the box. If you manually clicked on all the plug-ins to force them to load, then it may work.

// Create offline version
// Clone the entire DOM
var htmlEl = $('html').clone();
htmlEl.find('script').each(function(i, elem) {
    console.log("script:", i, "elem:", elem);

    if (elem.src) {
        // inline
        console.log("going to retrieve javascript url:", elem.src);
        // use chilipeppr's url retrieval method to solve cross-domain problems
        var url = "http://chilipeppr.com/geturl?url=" + elem.src;
        console.log("url we'll retrieve:", url);

        $.ajax({
            url: url,
            success: function(responseText) {
                console.log("swapping in javascript. length:", responseText.length, "url:", url, "elem:", elem, "args:", arguments);
                // swap the external script tag for an inline one holding the source
                var el = htmlEl.find(elem);
                console.log("el:", el);
                el.replaceWith($('<script type="text/javascript">').text(responseText));
            }
        });
    }
});

// give the async retrievals time to finish, then dump the final page
setTimeout(function() {
    console.log(htmlEl.html());
}, 10000);


Many features that rely on cloud calls will fail. The webcam, for instance, relies on you being logged in so it knows which server you are hosting your webcam stream from. The JSCut integration relies on a cloud call as well, using your login to see if you uploaded any files to CP via JSCut.

However, I can’t think of any cloud calls that are critical to basic functions. So in that respect you could have a sort of working version.

Wow, you are great @jlauer. The console logged out all the js files. What is next? Do I go to the Chrome menu and do File > Save Page As? How can I save this monolithic HTML page?

There’s a macro sample already in ChiliPeppr, in the Macro “book” icon pulldown menu, called “Download Gcode” that shows an example script to take a string and trigger a file download of the contents of the string. Give that a go, though I think you have about 10 further steps before you have something that works, but not sure.
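The core trick that macro relies on can be sketched like this (hypothetical helper names; the real “Download Gcode” macro may differ): encode the page string as a data: URI, then click a temporary anchor element with a download attribute pointing at it.

```javascript
// Sketch of the string-to-file-download idea (hypothetical helpers; the
// real "Download Gcode" macro may differ). Encode the text as a data: URI,
// then click a temporary <a download> element pointing at it.
function makeDownloadHref(text) {
    return "data:text/html;charset=utf-8," + encodeURIComponent(text);
}

function triggerDownload(text, filename) {
    var a = document.createElement("a");
    a.href = makeDownloadHref(text);
    a.download = filename; // e.g. "chilipeppr-offline.html"
    document.body.appendChild(a);
    a.click();
    a.remove();
}
```

So after the inlining macro finishes, something like `triggerDownload(htmlEl.html(), "chilipeppr-offline.html")` would save the monolithic page, assuming the browser allows the data: URI size involved.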