PolymerLabs/loose-it

Name: loose-it

Owner: PolymerLabs

Description: Pre build Polymer bindings and property effects

Created: 2017-08-30 01:56:32.0

Updated: 2017-09-29 23:19:12.0

Pushed: 2017-09-29 21:38:01.0

Homepage: null

Size: 392

Language: TypeScript


README

loose-it

This repository contains loose-it, a tool to pre-build Polymer bindings and property-effects metadata. The tool can be integrated into the existing Polymer tooling build pipeline. It relies on a custom version of Polymer that can work with a minimized, serialized representation of Polymer metadata.

:warning: :warning: Note: This tool is not ready for production :warning: :warning:

Integration in existing Polymer tooling pipeline

Integration requires direct interaction with the Polymer tooling pipeline using gulp. An example integration is shown in custom-build.

Updates to the gulpfile are as follows:

Add the following import:

```js
const looseIt = require('loose-it').PreBuildBindings;
```

Then update how the sources and dependencies streams are processed:

```js
// Let's start by getting your source files. These are all the
// files in your `src/` directory, or those that match your
// polymer.json "sources" property if you provided one.
const sourcesStream = polymerProject.sources();

// Similarly, you can get your dependencies separately and perform
// any dependency-only optimizations here as well.
const dependenciesStream = polymerProject.dependencies();

const buildStream = mergeStream(sourcesStream, dependenciesStream)
  // Apply the tool
  .pipe(new looseIt(polymerProject.config))

  .pipe(sourcesStreamSplitter.split())

  .pipe(gulpif(/\.js$/, babili()))
  .pipe(gulpif(/\.css$/, cssSlam()))
  .pipe(gulpif(/\.html$/, cssSlam()))
  .pipe(gulpif(/\.html$/, htmlMinify()))

  // Remember, you need to rejoin any split inline code when you're done.
  .pipe(sourcesStreamSplitter.rejoin())
  .once('data', () => {
    console.log('Analyzing build dependencies...');
  });
```

As the snippet above shows, instead of processing the streams separately, they must be merged up front. The tool hooks into this merged stream directly and analyzes it using the configuration of polymerProject. Finally, the stream is split so that per-file-type optimizations such as minification can be applied.

High-level implementation
  1. Read in all files.
  2. Analyze the files with polymer-analyzer
  3. Based on the analysis, obtain DFS traversal of HTML imports
    1. All files that were in the stream but not in the traversal, yield back in the stream
  4. Launch Chrome Headless using Puppeteer
  5. For all documents of the DFS traversal:
    1. Execute all scripts in the document
    2. For all defined elements in the document:
      1. Define dom-module in the browser
      2. Obtain metadata (bindings, property-effects) from browser
      3. Write binding metadata in front of JS ASTNode of element
    3. Serialize all ASTs in the document back into a file
  6. Yield potentially modified content from the file back in the stream
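Step 3 above can be sketched as a post-order depth-first traversal over the HTML import graph, so that each import is visited before any document that depends on it. The `imports` map below is a hypothetical stand-in for the graph the real tool derives from polymer-analyzer.

```javascript
// Hypothetical sketch of step 3: depth-first traversal of HTML imports.
// `imports` maps a document URL to the URLs it imports directly.
function dfsImports(entry, imports, visited = new Set(), order = []) {
  if (visited.has(entry)) return order;
  visited.add(entry);
  // Visit dependencies first, so each imported document is processed
  // before the document that imports it.
  for (const dep of imports.get(entry) || []) {
    dfsImports(dep, imports, visited, order);
  }
  order.push(entry);
  return order;
}
```

Documents shared by several imports (for example a common `shared.html`) appear only once in the resulting order, which is what makes step 3.1 (yielding back files not in the traversal) well defined.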
