This is using Pug, Sass & CoffeeScript, not the “original” HTML, CSS & JavaScript, so I don’t think that will work. I have integrated many CodePens into Glide before, provided they use “original” HTML, CSS & JavaScript, so that does work.
Ok yes, it was just the first one I picked.
I’m not an expert, I have to say, so even your explanation is a lesson for me, thanks.
But, as far as I understand, you have integrated many CodePens into Glide before, provided they use “original” HTML, CSS & JavaScript, so that does work.
Can I ask you to show me an example?
Have a nice weekend
https://codepen.io/derwinsadiwa/pen/obbjdx
Here’s the adapted version of that Codepen.
https://replit.com/@ThinhDinh/codepen-test#function.js
Oh great!
And how do you embed it on Glide?
I used the result of that code in a webview component.
Ok, I suppose so.
It would have been great to see the platform enhanced enough to use that directly, rather than by means of a webview. For now, the Rich Text editor is the only way to visually enrich the app and go a little bit further with the UI.
Thanks for sharing.
PS: this pen doesn’t use HTML and CSS, right? So basically we could achieve something more aesthetically pleasing, such as with images, right?
It’s written with Babel, so I doubt it works.
Nevertheless, just try playing around with Replit/GitHub using those CodePens. You never know which one will actually work.
A very, very useful feature. But even though we have the amazing and powerful Glide editor, we still must do it like this: Loom | Free Screen & Video Recording Software
Java, CodePen, this is inline code, isn’t it? I hope that in the future David gives us such a useful instrument natively!
Hi @ThinhDinh,
I don’t know anything about JavaScript, but I have a good friend (ChatGPT) who helped me build a basic scraper (similar to IMPORTXML in Google Sheets); the code (below) uses the “cheerio” library (…) and I wonder if I can use it in a Glide Table, instead of continuing to use Google Sheets?
Thanks in advance
function scrapeUrls() {
  // Change the sheet name if necessary
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName("urlMetaData");
  // Assumes the URLs are in column A, starting from row 2
  var urls = sheet.getRange("A2:A").getValues();
  Logger.log(urls);
  for (var i = 0; i < urls.length; i++) {
    var url = urls[i][0];
    if (url) {
      var data = getWebsiteData(url);
      var title = data[0];
      var keywords = data[1];
      var description = data[2];
      var logo = data[3];
      // Write the extracted information back to the sheet
      sheet.getRange(i + 2, 2, 1, 4).setValues([[title, keywords, description, logo]]);
    }
  }
}

function getWebsiteData(url) {
  // Make an HTTP request to the URL
  var response = UrlFetchApp.fetch(url);
  // Get the content of the response as a string
  var content = response.getContentText();
  // Parse the HTML (requires the Cheerio library to be added to the Apps Script project)
  var $ = Cheerio.load(content);
  // Scrape title, keywords, description and logo using cheerio methods
  var title = $("title").text();
  var keywords = $("meta[name='keywords']").attr("content");
  var description = $("meta[name='description']").attr("content");
  var logo = $("link[rel='icon']").attr("href");
  // Scrape additional meta tags using cheerio methods
  var og_image = $("meta[property='og:image']").attr("content");
  var og_title = $("meta[property='og:title']").attr("content");
  var og_description = $("meta[property='og:description']").attr("content");
  var slogan = $("meta[name='slogan']").attr("content");
  var twitter_site = $("meta[name='twitter:site']").attr("content");
  // twitter_creator is used in the return below, so it must be scraped too
  var twitter_creator = $("meta[name='twitter:creator']").attr("content");
  var fb_app_id = $("meta[property='fb:app_id']").attr("content");
  var fb_admins = $("meta[property='fb:admins']").attr("content");
  Logger.log("Title: " + title);
  Logger.log("Keywords: " + keywords);
  Logger.log("Description: " + description);
  Logger.log("Logo: " + logo);
  // Log and return new meta tag values
  Logger.log("OG Image: " + og_image);
  Logger.log("OG Title: " + og_title);
  Logger.log("OG Description: " + og_description);
  Logger.log("Slogan: " + slogan);
  Logger.log("Twitter Site: " + twitter_site);
  Logger.log("Twitter Creator: " + twitter_creator);
  Logger.log("FB App ID: " + fb_app_id);
  Logger.log("FB Admins: " + fb_admins);
  return [title, keywords, description, logo, og_image, og_title, og_description, slogan, twitter_site, twitter_creator, fb_app_id, fb_admins];
}
I wouldn’t recommend having a scraper inside Glide Tables. I don’t know if you need this information to be extracted “live” or not, but if you just need it once, then using JS will not be good for you. Every user device will have to run that piece of code, assuming it’s working, and running that for every row in your table won’t be good.
Ah ok, thank you for the advice.
Then I should run it in gSheet and move the result to a Glide Table via Make, right?
Thanks
You can do it in Glide Tables directly, if you run a call through Make and then use the API to write the result back to the correct row in Glide Tables.
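In case it helps, here’s a rough sketch of what that write-back step could look like. All of the IDs below (app ID, table name, row ID, column names) are placeholders you’d swap for your own, and this assumes Glide’s `mutateTables` endpoint with a `set-columns-in-row` mutation, so double-check against the current Glide API docs before relying on it:

```javascript
// Build the body of a Glide API "set-columns-in-row" mutation for one scraped result.
// Every ID here is a placeholder; replace with your own app/table/row identifiers.
function buildGlideMutation(appID, tableName, rowID, columnValues) {
  return {
    appID: appID,
    mutations: [
      {
        kind: "set-columns-in-row",
        tableName: tableName,
        rowID: rowID,
        // e.g. { Title: "...", Description: "..." }
        columnValues: columnValues,
      },
    ],
  };
}

// Example: the payload Make (or any HTTP client) would POST to
// https://api.glideapp.io/api/function/mutateTables
// with an "Authorization: Bearer <your token>" header.
const payload = buildGlideMutation(
  "YOUR_APP_ID",            // hypothetical app ID
  "native-table-XXXX",      // hypothetical table name
  "ROW_ID_FROM_GLIDE",      // row to update
  { Title: "Example title", Description: "Example description" }
);
console.log(JSON.stringify(payload, null, 2));
```

The idea is that Make handles the scheduling and the HTTP call, while this payload tells Glide exactly which row to update and with what values.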