[Snyk] Upgrade @microsoft/api-extractor from 7.7.11 to 7.13.2 #176

Merged

Conversation

@snyk-bot commented Apr 9, 2021

Snyk has created this PR to upgrade @microsoft/api-extractor from 7.7.11 to 7.13.2.

Merge advice
ℹ️ Keep your dependencies up-to-date. This makes it easier to fix existing vulnerabilities and to more quickly identify and fix newly disclosed vulnerabilities when they affect your project.


  • The recommended version is 61 versions ahead of your current version.
  • The recommended version was released a month ago, on 2021-03-04.

Note: You are seeing this because you or someone else with access to this repository has authorized Snyk to open upgrade PRs.

For more information:

🧐 View latest project report

🛠 Adjust upgrade PR settings

🔕 Ignore this dependency or unsubscribe from future upgrade PRs

See this package in npm:


See this project in Snyk:
https://app.snyk.io/org/kadirselcuk/project/fed70c8d-1e2a-4a10-9c7b-47e6d3ab59fa?utm_source=github&utm_medium=upgrade-pr
@mistaken-pull-closer

Thanks for your submission.

It appears that you've created a pull request using one of our repository's branches. Since this is
almost always a mistake, we're going to go ahead and close this. If it was intentional, please
let us know what you were intending and we can see about reopening it.

Thanks again!

pull-dog bot commented Apr 9, 2021

*Ruff* 🐶 I wasn't able to find any Docker Compose files in your repository at any of the given paths in the pull-dog.json configuration file, or the default docker-compose.yml file. 😩 Make sure the given paths are correct.

Files checked:

  • docker-compose.yml
What is this?

Pull Dog is a GitHub app that makes test environments for your pull requests using Docker, from a docker-compose.yml file you specify. It takes 19 seconds to set up (we counted!) and there's a free plan available.

Visit our website to learn more.

Commands
  • @pull-dog up to reprovision or provision the server.
  • @pull-dog down to delete the provisioned server.
Troubleshooting

Need help? Don't hesitate to file an issue in our repository.

Configuration

{
  "isLazy": false,
  "dockerComposeYmlFilePaths": [
    "docker-compose.yml"
  ],
  "expiry": "00:00:00",
  "conversationMode": "singleComment"
}

Trace ID
0fd8c030-9987-11eb-9762-69c7169d5bfc

@mistaken-pull-closer bot added the "invalid" label ("This doesn't seem right") on Apr 9, 2021

guardrails bot commented Apr 9, 2021

⚠️ We detected 1567 security issues in this pull request:
Mode: paranoid | Total findings: 1567 | Considered vulnerability: 0

Hard-Coded Secrets (3)


* // given URL http://user:password@example.com:8080/#/some/path?foo=bar&baz=xoxo

https://github.com/turkdevops/angular/blob/dce021386b884120c590078964ba263bdc40394e/.yarn/releases/yarn-1.22.5.js#L128456

More info on how to fix Hard-Coded Secrets in General and Javascript.
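
As a general illustration (not specific to the flagged yarn-1.22.5.js line, which is vendored third-party code), the usual remediation is to read credentials from the environment at runtime instead of embedding them in a URL in the source tree. A minimal TypeScript sketch; EXAMPLE_API_USER and EXAMPLE_API_PASSWORD are placeholder names, not values used in this repository:

// Read credentials from the environment instead of hard-coding them.
// EXAMPLE_API_USER and EXAMPLE_API_PASSWORD are placeholder names.
const user = process.env.EXAMPLE_API_USER;
const password = process.env.EXAMPLE_API_PASSWORD;
if (!user || !password) {
  throw new Error('Missing EXAMPLE_API_USER or EXAMPLE_API_PASSWORD');
}

// Build the URL at runtime; the secret never appears in version control.
const url = new URL('https://example.com:8080/some/path?foo=bar');
url.username = user;
url.password = password;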


Insecure Use of Regular Expressions (138)


return /\.spec\.(.*\.)?js$/.test(path);

return /\.spec\.(.*\.)?js$/.test(path);

const regex = new RegExp(`^ {0,3}(${disallowedHeadings.join('|')}) +.*$`, 'mg');


var INLINE_LINK = /(\S+)(?:\s+([\s\S]+))?/;

const anyBlockMatcher = new RegExp('^' + createOpenMatcher(`(${plainBlocks.join('|')})`));

const bothMatcher = new RegExp(left + '|' + right, 'g' + flags.replace(/g/g, ''));

const leftMatcher = new RegExp(left, flags.replace(/g/g, ''));

LOGGED_TEXT = LOGGED_TEXT.replace(/\x1B\[([0-9]{1,3}(;[0-9]{1,2})?)?[mGK]/g, '');

this._githubTokenRegex = new RegExp(githubToken, 'g');

const TYPE_SCOPE_RE = /^(\w+)(?:\(([^)]+)\))?\:\s(.+)$/;

return data.commit.message.match(new RegExp(`(?:close[sd]?|fix(?:e[sd]?)|resolve[sd]?):? #${id}(?!\\d)`, 'i'));

return new RegExp(`(<a name="${escapedVersion}"></a>.*?)(?:<a name="|$)`, 's');

const nonExactDeps = deps.filter(([, version]) => !/^\d+\.\d+\.\d+(?:-\w+\.\d+)?$/.test(version));

return new RegExp(s.replace(/([.*+?^=!:${}()|[\]\/\\])/g, '\\$1'), 'g');




const origin = `${scheme}://pr${pr}-${shortSha9}.${host}`;

const origin = `${scheme}://pr${pr}-${sha9}.${host}`;

it('should return /foo/bar.js', async () => {

const origin = `${scheme}://pr${pr}-${sha9}.${host}`;

const origin = `${scheme}://pr${pr}-${shortSha9}.${host}`;

it('should accept SHAs with leading zeros (but not trim the zeros)', async () => {

const bodyRegex9 = new RegExp(`^PR: ${pr} | SHA: ${sha9} | File: /index\\.html$`);



const regexPrefix = `^BUILD: ${BUILD} \\| PR: ${PR} \\| SHA: ${SHA} \\| File:`;

const idxContentRegex = new RegExp(`${regexPrefix} \\/index\\.html$`);

const regexPrefix1 = `^PR: ${PR} \\| SHA: ${ALT_SHA} \\| File:`;

const idxContentRegex1 = new RegExp(`${regexPrefix1} \\/index\\.html$`);

const regexPrefix2 = `^BUILD: ${BUILD} \\| PR: ${PR} \\| SHA: ${SHA} \\| File:`;

const idxContentRegex2 = new RegExp(`${regexPrefix2} \\/index\\.html$`);

const regexPrefix = `^PR: ${PR} \\| SHA: ${SHA} \\| File:`;

const idxContentRegex = new RegExp(`${regexPrefix} \\/index\\.html$`);

const publicPrDir = h.getPrDir(PrNums.TRUST_CHECK_ACTIVE_TRUSTED_USER, true);

validate(control: AbstractControl): ValidationErrors | null {

.filter(spec => spec.positive)

.filter(spec => !spec.positive)

it('should match pattern "^(https?:/)?/.*"', () => {

/** Regex determining the scope of a commit if provided. */

return data.commit.message.match(

// based on the conventional-changelog version. They removed anchors in more recent versions.

releaseConfig.extractReleaseNotesPattern = version =>

// Strip ANSI escape codes from log outputs.

if (githubToken != null) {





exp: string|number, errors: string[], allowNegativeValues?: boolean): AnimateTimings {

export const ISO8601_DATE_REGEX =





// (e.g. only keep `@angular/common` from `@angular/common/http`).

function countOccurrences(haystack: string, needle: string): number {

const pattern = `^(.*)\\.${gen.extensionPrefix}\\.ts$`;



export function verifyUniqueFactory(output: string, type: string): boolean {

const INLINE_BACKTICK_STRING = /`(([\s\S]*?)(\$\{[^}]*?\})?)*?[^\\]`/;





output: string, functionNamePattern?: string, expectedCount?: number): boolean {

function parseRegExp(str: string|undefined): RegExp {



const maybeRef = ref ? `, ${ref}` : ``;

const maybeRef = ref ? `, ${ref}` : ``;

const setClassMetadataRegExp = (expectedType: string): RegExp =>


// in the chrome dev tools.

const WS_CHARS = ' \f\n\r\t\v\u1680\u180e\u2000-\u200a\u2028\u2029\u202f\u205f\u3000\ufeff';

const NO_WS_REGEXP = new RegExp(`[^${WS_CHARS}]`);

// lexer is replacing the &ngsp; pseudo-entity with NGSP_UNICODE


scopeSelector = scopeSelector.replace(lre, '\\[').replace(rre, '\\]');


const _cssColonHostRe = new RegExp('(' + _polyfillHost + _parenSuffix, 'gim');


expect(() => compileApp())

expect(() => compileApp())







it('should escape regexp', () => {

expect(new RegExp(escapeRegExp('b')).exec('abc')).toBeTruthy();

expect(new RegExp(escapeRegExp('b')).exec('adc')).toBeFalsy();

expect(new RegExp(escapeRegExp('a.b')).exec('a.b')).toBeTruthy();

const minLineIndent = Math.min(...matches.map(el => el.length));

if (typeof meta === 'string') {

message.indexOf(`:${subTemplateIndex}${MARKER}`) + 2 + subTemplateIndex.toString().length;

const PP_PLACEHOLDERS_REGEXP = /\[(.+??)\]|(\/?\*\d+:\d+)/g;

const PP_ICU_PLACEHOLDERS_REGEXP = /{([A-Z0-9_]+)}/g;

expect(() => initWithTemplate('<div [id]="unstableStringExpression"></div>'))

'<div id="Expressions: {{ a }} and {{ unstableStringExpression }}!"></div>'))

expect(() => initWithTemplate('<div [attr.id]="unstableStringExpression"></div>'))

'<div attr.id="Expressions: {{ a }} and {{ unstableStringExpression }}!"></div>'))

expect(() => initWithTemplate('<div [style.color]="unstableColorExpression"></div>'))

expect(() => initWithTemplate('<div [class.someClass]="unstableBooleanExpression"></div>'))

expect(() => initWithHostBindings({'[id]': 'unstableStringExpression'}))

expect(() => initWithHostBindings({'[style.color]': 'unstableColorExpression'}))

expect(() => initWithHostBindings({'[class.someClass]': 'unstableBooleanExpression'}))

expect(match).toBeDefined();

return element.innerHTML.replace(/\sng-reflect-\S*="[^"]*"/g, '')

const styles = document.querySelectorAll('head style');



resolveCompileAndCreateComponent(MyComp, template);


getSourcePositionForStack(stack: string, genFile: string): SourcePos {

html = html.replace(/\sng-reflect-\S*="[^"]*"/g, '')

TestBed.createComponent(type);


const EMAIL_REGEXP =

function replaceOnce(searchText: string, regex: RegExp, replaceText: string): OverwriteResult {

if (tail instanceof Text) {


query.forEach((value: string[], name: string) => {





include: includeUrls.map(spec => new RegExp(spec.regex)),

// Patterns in the config are regular expressions disguised as strings. Breathe life into them.

private prefix: string) {



toMatch: function(actual: any) {

const bazelWorkspaceManifestPathRegex =

More info on how to fix Insecure Use of Regular Expressions in Javascript and Typescript.
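
Most of these findings involve patterns assembled from dynamic strings via new RegExp(...). A minimal sketch of the common mitigation, assuming the dynamic part should be matched literally (the helper mirrors the escapeRegExp utility that already appears among the flagged lines; containsToken is a hypothetical usage):

// Escape dynamic input before interpolating it into a pattern, so a
// user-controlled string cannot change the pattern's meaning or introduce
// pathological backtracking.
function escapeRegExp(s: string): string {
  return s.replace(/[.*+?^=!:${}()|[\]\/\\]/g, '\\$&');
}

// Hypothetical usage: match `token` as a literal word.
function containsToken(text: string, token: string): boolean {
  return new RegExp(`\\b${escapeRegExp(token)}\\b`).test(text);
}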


Insecure Use of Dangerous Function (49)

var exec = require('child_process').exec;

const {execSync} = require('child_process');

const {spawn} = require('child_process');

const matches = /<title>(.*)<\/title>/.exec(content);

const ngTokenMatch = /^[nN]g([A-Z]\w*)/.exec(token);




const match = /^\{@[^\s\}]+[^\}]*\}/.exec(value);

var child_process = require('child_process');

const child_process = require('child_process');

const child_process = require('child_process');

const {execSync} = require('child_process');

const spawnSync = require('child_process').spawnSync;

var childProcess = require('child_process');

const spawnSync = require('child_process').spawnSync;

const {spawnSync} = require('child_process');

// correspond to other context variables provided by PullApprove for conditions.

nock(this.repoApiUrl).get(`/contents//CHANGELOG.md`).query(p => p.ref === branch).reply(200, {

const packageJsonPath = join(getRepoBaseDir(), 'package.json');






function parsePlaceholders(str: string): Placeholder[] {

function parseMetaProperties(str: string): Record<string, string> {


// straightforward implementation:

const migrationCollectionPath = require.resolve('../migrations.json');

getter(name: string): GetterFn {

setter(name: string): SetterFn {

return o.${name}.apply(o, args);`;

// straightforward implementation:

it('should render hello world when not minified', withBody('<trigger></trigger>', () => {

it('should render hello world when debug minified', withBody('<trigger></trigger>', () => {

it('should render hello world when fully minified', withBody('<trigger></trigger>', () => {

it('should render template form', withBody('<app-root></app-root>', async () => {

withBody('<hello-world></hello-world>', () => {

withBody('<hello-world></hello-world>', () => {

withBody('<hello-world></hello-world>', () => {


withBody('<todo-app></todo-app>', async () => {

it('should render todo', withBody('<todo-app></todo-app>', async () => {


const chunkSize = 512;

const chunkSize = 512;


const path = require('path');

More info on how to fix Insecure Use of Dangerous Function in Javascript and Typescript.
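
For the child_process findings, the usual mitigation is to avoid exec with an interpolated shell string and instead pass arguments as an array to execFile or spawn, so untrusted values cannot inject shell syntax. A minimal sketch; the branch argument stands in for a hypothetical untrusted input:

// Prefer execFile/spawn with an argument array over exec with a shell string.
import {execFile} from 'child_process';

function gitLog(branch: string): Promise<string> {
  return new Promise((resolve, reject) => {
    // No shell is involved; `branch` is passed as a single argv entry.
    execFile('git', ['log', '--oneline', branch], (err, stdout) => {
      err ? reject(err) : resolve(stdout);
    });
  });
}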


Insecure File Management (829)

var engines = require(__dirname + '/../package.json').engines;

let output = fs.createWriteStream(zipFileName);

return require(filePath, 'utf-8').projectType || 'cli';

let json = require(configFileName, 'utf-8');

let content = fs.readFileSync(fileName, 'utf8');

zip.append(fs.readFileSync(this.examplesSystemjsConfig, 'utf8'), { name: 'src/systemjs.config.js' });

zip.append(fs.readFileSync(this.examplesSystemjsLoaderConfig, 'utf8'), { name: 'src/systemjs-angular-loader.js' });

let tsconfig = fs.readFileSync(this.exampleTsconfig, 'utf8');

if (fs.existsSync(examplePath)) {

fs.writeFileSync(path.resolve(examplePath, EXAMPLE_CONFIG_FILENAME), '');


if (fs.existsSync(gitignoreFilePath)) {

const gitignoreFile = fs.readFileSync(gitignoreFilePath, 'utf8');

const gitignore = ignore().add(fs.readFileSync(path.resolve(BOILERPLATE_BASE_PATH, '.gitignore'), 'utf8'));

if (!fs.existsSync(SHARED_NODE_MODULES_PATH)) {

fs.writeFileSync(outputFile, header);

if (argv.viewengine && fs.existsSync(appDir + '/aot/index.html')) {

fs.appendFileSync(outputFile, emsg);

fs.appendFileSync(outputFile, emsg);

fs.appendFileSync(outputFile, '++ AoT version ++\n');

if (fs.existsSync(appDir + '/' + copyFileCmd)) {

fs.appendFileSync(outputFile, `Passed: ${appDir}\n\n`);

fs.appendFileSync(outputFile, `Failed: ${appDir}\n\n`);

fs.appendFileSync(outputFile, log);

fs.appendFileSync(outputFile, output);


.filter(childPath => fs.statSync(childPath).isDirectory())

.filter(pkgJsonPath => fs.existsSync(pkgJsonPath));

return JSON.parse(fs.readFileSync(filePath, 'utf8'));

fs.writeFileSync(boilerplatePkgJsonPath, `${JSON.stringify(boilerplatePkgJson, null, 2)}\n`);

const packageConfigFile = fs.readFileSync(pathToPackageConfig, 'utf8');

fs.writeFileSync(pkg.packageJsonPath, JSON.stringify(tmpConfig, null, 2));

fs.writeFileSync(pathToPackageConfig, localPackageConfigJson);

fs.writeFileSync(pathToPackageConfig, packageConfigFile);

fs.writeFileSync(pkg.packageJsonPath, JSON.stringify(pkg.config, null, 2));

const packageConfig = fs.existsSync(packageJsonPath) ? require(packageJsonPath) : null;

const lockfileContent = fs.readFileSync(lockfilePath, 'utf8');

return fs.existsSync(this.localMarkerPath);

fs.writeFileSync(this.localMarkerPath, contents);

expect(fs.existsSync).toHaveBeenCalledWith(path.resolve(projectDir, 'node_modules/_local_.json'));

expect(fs.existsSync).toHaveBeenCalledWith(path.resolve(projectDir, 'node_modules/_local_.json'));

expect(fs.readFileSync).toHaveBeenCalledWith(packageJsonPath, 'utf8');

expect(fs.writeFileSync).toHaveBeenCalledWith(packageJsonPath, expectedModifiedPackageJson);

expect(fs.writeFileSync).toHaveBeenCalledWith(packageJsonPath, dummyPackageJson);

expect(fs.existsSync(buildScript)).toBe(true);

expect(fs.readFileSync).toHaveBeenCalledWith('/foo/bar/yarn.lock', 'utf8');

expect(fs.existsSync).toHaveBeenCalledWith(path.resolve(nodeModulesDir, '_local_.json'));

expect(fs.existsSync).toHaveBeenCalledWith(path.resolve(nodeModulesDir, '_local_.json'));

expect(fs.writeFileSync).toHaveBeenCalledWith(path.resolve(nodeModulesDir, '_local_.json'), 'test contents');

const configSrc = fs.existsSync(configPath) && fs.readFileSync(configPath, 'utf-8').trim();

this._boilerplatePackageJsons[exampleType] = fs.existsSync(pkgJsonPath) ? require(pkgJsonPath) : null;

fs.writeFileSync(outputFileName, html, 'utf-8');

fs.writeFileSync(altFileName, html, 'utf-8');

if (fs.existsSync(outputFileName)) {

fs.unlinkSync(outputFileName);

if (altFileName && fs.existsSync(altFileName)) {

fs.unlinkSync(altFileName);

if (!fs.existsSync(path.join(config.basePath, config.file))) {

const primaryFile = defaultPrimaryFiles.find(fileName => fs.existsSync(path.join(config.basePath, fileName)));

if (config.main && !fs.existsSync(path.join(config.basePath, config.main))) {

content = fs.readFileSync(fileName, 'utf-8');

return fs.readFileSync(file, { encoding: 'base64' });

const configSrc = fs.readFileSync(configFileName, 'utf-8');

.factory('packageInfo', function() { return require(path.resolve(PROJECT_ROOT, 'package.json')); })

wordsToIgnore = fs.readFileSync(ignoreWordsPath, 'utf8').toString().split(/[,\s\n\r]+/gm);

const gitignoreFile = fs.readFileSync(gitignoreFilePath, 'utf8');

expect(fs.writeFile).toHaveBeenCalled();

expect(fs.writeFile).toHaveBeenCalled();

expect(fs.writeFile).toHaveBeenCalled();

expect(fs.writeFile).toHaveBeenCalled();

expect(fs.writeFile).toHaveBeenCalled();

expect(fs.writeFile).toHaveBeenCalled();

expect(fs.writeFile).toHaveBeenCalled();

expect(fs.writeFile).toHaveBeenCalled();

watchr.open(CONTENTS_PATH, listener, next);

watchr.open(API_SOURCE_PATH, listener, next);

const cliPackage = require(resolve(CLI_SOURCE_PATH, 'package.json'));

const schemaJson = fs.readFileSync(schemaJsonPath);

.map(p => require(resolve(absolutePath, p)));

const pkg = new Package('mock_' + packageName, [require('../' + packageName)]);

pkg.factory('packageInfo', function() { return require(path.resolve(PROJECT_ROOT, 'package.json')); });

const buffer = fs.readFileSync(input);

fs.writeFileSync(output, compress(buffer, {mode: 0, quality: 11}));

let banner = fs.readFileSync(bannerFile, 'utf8');

const versionTag = fs.readFileSync(stampDataFile, 'utf8')

if (require.extensions['.ts'] === undefined && fs.existsSync(configPath + ".ts") &&

return require(configPath);

if (require.extensions['.ts'] === undefined && fs.existsSync(configPath + ".ts") &&

return require(configPath);

fs.writeFileSync(logFilePath, LOGGED_TEXT);

fs.writeFileSync(path.join(getRepoBaseDir(), ".ng-dev.err-" + now.getTime() + ".log"), LOGGED_TEXT);

if (!fs.existsSync(angularRobotFilePath)) {

const robotConfig = yaml.parse(fs.readFileSync(angularRobotFilePath).toString());

if (fs.existsSync(commitMessageDraftPath)) {

return fs.readFileSync(commitMessageDraftPath).toString();

if (fs.existsSync(commitMessageDraftPath)) {

fs.unlinkSync(commitMessageDraftPath);

fs.writeFileSync(`${basePath}.ngDevSave`, commitMessage);

fs.writeFileSync(filePath, commitMessage);

const commitMessage = fs.readFileSync(path.resolve(getRepoBaseDir(), filePath), 'utf8');

fs.writeFileSync(args.filePath, defaultCommitMessage);

fs.writeFileSync(args.filePath, commitMessage);

const ngBotYaml = fs.readFileSync(NGBOT_CONFIG_YAML_PATH, 'utf8');

const pullApproveYamlRaw = fs.readFileSync(PULL_APPROVE_YAML_PATH, 'utf8');

const pkgJson = JSON.parse(yield fs.promises.readFile(pkgJsonPath, 'utf8'));

yield fs.promises.writeFile(pkgJsonPath, `${JSON.stringify(pkgJson, null, 2)}\n`);

const localChangelog = yield fs.promises.readFile(localChangelogPath, 'utf8');

yield fs.promises.writeFile(localChangelogPath, releaseNotes + localChangelog);

const { version: packageJsonVersion } = JSON.parse(yield fs.promises.readFile(path.join(pkg.outputPath, 'package.json'), 'utf8'));

const { version } = require(packageJsonPath);

return fs.statSync(filePath);

const fileContent = fs.readFileSync(resolvedPath, 'utf8');

const config = require(configPath);

fs.writeFileSync(goldenFile, JSON.stringify(actual, null, 2));

else if (!fs.existsSync(goldenFile)) {

const expected = JSON.parse(fs.readFileSync(goldenFile, 'utf8'));

const taskModule = require('./tools/gulp-tasks/' + fileName);

const filesWithNgDevMode = fs.readdirSync(distPath)

.filter(p => fs.readFileSync(path.join(distPath, p), 'utf-8').includes(ngDevModeVariable));

const filesWithNgDevMode = fs.readdirSync(distPath)

.filter(p => fs.readFileSync(path.join(distPath, p), 'utf-8').includes(ngDevModeVariable));

const contents = fs.readFileSync(filePath, 'utf8');

fs.writeFileSync(filePath + '.bak', contents, 'utf8');

fs.writeFileSync(filePath, updated, 'utf8');

const contents = fs.readFileSync(filePath, 'utf8');

fs.writeFileSync(updatedFilePath, updatedContents, 'utf8');

const angularModules = fs.readdirSync(angularRoot).map(function (name) {

const content = fs.readFileSync(path.join(angularRoot, name, 'package.json'), 'utf-8').toString();

const expectedContent = fs.readFileSync(path.join(projectDir, relativeFilePath), 'utf8');

const actualContent = fs.readFileSync(path.join(projectDir, actualFilePath), 'utf8');

const entryPointPkgJson = require(entryPointPkgJsonPath);

const data = JSON.parse(fs.readFileSync(input, {encoding: 'utf-8'}));

fs.writeFileSync(output, JSON.stringify(data));

return fs.statSync(filePath).isFile();

banner = fs.readFileSync(bannerFile, {encoding: 'utf-8'});

const versionTag = fs.readFileSync(stampData, {encoding: 'utf-8'})


fs.readFileSync(require.resolve(`${PKG}/flat_module_filename.d.ts`), {encoding: 'utf-8'});

fs.symlinkSync(moduleDir, outputPath, 'junction');

return fs.readFileSync(runfilesManifestPath, 'utf8')

return shx.find(directoryPath).filter(filePath => !fs.lstatSync(filePath).isDirectory());

shell.ls(baseDir).filter((filename) => fs.statSync(path.join(baseDir, filename)).isDirectory());

fs.writeFileSync(writePath, JSON.stringify(baseTimes, undefined, 2));


const stats = fs.statSync(target.path);

const versionTag = require('fs')

const versionTag = require('fs')

fs.readFile(file, function(err, contents) {

fs.writeFile(READY_FILE, '');

const allLimitSizes = JSON.parse(fs.readFileSync(limitFile, 'utf8'));

const packageJson = require(packageJsonPath);

const puppeteerVersion = require(puppeteerPkgPath).version;

if (!fs.existsSync(path.join(__dirname, 'cldr/cldr-data'))) {

if (fs.existsSync(cldrDataFolder)) {

fs.rmdirSync(cldrDataFolder, {recursive: true});

fs.mkdirSync(cldrDataFolder);

console.log(RELATIVE_I18N_DATA_FOLDER, fs.existsSync(RELATIVE_I18N_DATA_FOLDER));

if (!fs.existsSync(RELATIVE_I18N_DATA_FOLDER)) {

const fileList = _fs.readdirSync(_path.join(__dirname, 'cldr-data', dirName));

return require('./cldr-data/' + path);

fs.writeFileSync(

if (fs.existsSync(`${RELATIVE_I18N_DATA_FOLDER}/${l}.ts`)) {

fs.readFileSync(path, 'utf8')

if (!fs.existsSync(RELATIVE_I18N_FOLDER)) {

fs.mkdirSync(RELATIVE_I18N_FOLDER);

if (!fs.existsSync(RELATIVE_I18N_DATA_FOLDER)) {

fs.mkdirSync(RELATIVE_I18N_DATA_FOLDER);

if (!fs.existsSync(RELATIVE_I18N_DATA_EXTRA_FOLDER)) {

fs.mkdirSync(RELATIVE_I18N_DATA_EXTRA_FOLDER);

if (!fs.existsSync(RELATIVE_I18N_GLOBAL_FOLDER)) {

fs.mkdirSync(RELATIVE_I18N_GLOBAL_FOLDER);

fs.writeFileSync(`${RELATIVE_I18N_FOLDER}/currencies.ts`, generateCurrenciesFile());

fs.writeFileSync(`${RELATIVE_I18N_CORE_FOLDER}/locale_en.ts`, localeEnFile);

fs.writeFileSync(

fs.writeFileSync(

fs.writeFileSync(

const inputPackageJson = JSON.parse(fs.readFileSync(inputPackageJsonPath, 'utf8'));

const basePackageJson = JSON.parse(fs.readFileSync(basePackageJsonPath, 'utf8'));

fs.writeFileSync(outputPath, JSON.stringify(result, null, 2));

if (fs.existsSync(path)) {

var subpaths = fs.readdirSync(path);

if (fs.lstatSync(curPath).isDirectory()) {

fs.unlinkSync(curPath);

fs.rmdirSync(path);

const runfiles = require(process.env['BAZEL_NODE_RUNFILES_HELPER']);

if (!fs.existsSync(p)) {


return fs.existsSync(p) && fs.statSync(p).isFile();

fs.chmodSync(dest, isExecutable(src) ? '755' : '644');

binary = (runfilesBinary && fs.existsSync(runfilesBinary)) ? runfilesBinary : binary;

const contents = JSON.parse(fs.readFileSync(packageJson, {encoding: 'utf-8'}));

fs.writeFileSync(packageJson, contentsEncoded);

fs.readFileSync(`${outputBase}/DO_NOT_BUILD_HERE`, {encoding: 'utf-8'});

fs.writeFileSync(manifestPath, JSON.stringify(manifest, null, 2));

const config = require(runfiles.resolveWorkspaceRelative(process.argv[2]));

const runfiles = require(process.env['BAZEL_NODE_RUNFILES_HELPER']);

return fs.readFileSync(path.resolve(process.cwd(), filePath), 'UTF-8');

return new Promise<number[]>((resolve, reject) => {

return new Promise<string[]>((resolve, reject) => {

const outPath = computeArtifactDownloadPath(this.downloadDir, pr, sha, artifactPath);



const leftoverDownloads = fs.readdirSync(AIO_DOWNLOADS_DIR);

const absFilePath = path.join(shaDir, relFilePath);

public writeFile(filePath: string, {content, size}: FileSpecs, force = false): void {


// Create a file with the specified content.


const pkgJsonPath = join(this.projectDir, packageJsonPath);

// to avoid unnecessary diff. IDEs usually add a trailing new line.

const localChangelogPath = getLocalChangelogFilePath(this.projectDir);

// Prepend the extracted release notes to the local changelog and write it back.

const {version: packageJsonVersion} =



const pkgJson = path.resolve(path.dirname(entryPoint), 'package.json');

if (!fs.existsSync(pkgJson)) {

// Parameters are specified in the file one per line.

shx.mkdir('-p', path.dirname(outputPath));

function copyFileFromInputPath(inputPath: string) {



function rewireMetadata(metadataPath: string, typingsPath: string): string {

function readTypingsAndStripAmdModule(filePath: string): string {

const maybeMetadataFile = importedFilePath.replace(EXT, '') + '.metadata.json';


const manifest = constructManifest(tsickleEmitResult.modulesManifest, bazelHost);



// compare these in a golden file, the order needs to be consistent across different platforms.


// compare these in a golden file, the order needs to be consistent across different platforms.

process.chdir(testPackage.packagePath);

function readFileContents(filePath: string): string {

function hashFileContents(filePath: string): string {

const bazelBinPath = path.resolve(basePath, bazelBin);

const parent = path.dirname(dirname);


const newDir = path.resolve(basePath, dir);


function read(fileName: string) {

function shouldExist(fileName: string) {

function shouldNotExist(fileName: string) {

const dir = path.join(baseDir, `tmp.${id}`);

export function listFilesRecursive(dir: string, fileList: string[] = []) {

fs.readdirSync(dir).forEach(file => {

return new Promise<void>(function(resolve, reject) {

if (this._rawPerflogPath && events.length) {

const basicFilePath = path.join(outDir, 'basic.js');

expect(fs.existsSync(basicFilePath)).toBeTruthy();

const metadataOutput = path.join(outDir, 'basic.metadata.json');

expect(fs.existsSync(metadataOutput)).toBeTruthy();

const dtsOutput = path.join(outDir, 'basic.d.ts');

expect(fs.existsSync(dtsOutput)).toBeTruthy();

const factoryOutput = path.join('node_modules', '@angular', 'common', 'common.ngfactory.js');

const xmbOutput = path.join(outputDir, 'custom_file.xmb');

expect(fs.existsSync(xmbOutput)).toBeTruthy();

const xlfOutput = path.join(outputDir, 'messages.xlf');

expect(fs.existsSync(xlfOutput)).toBeTruthy();

const xlfOutput = path.join(outputDir, 'messages.xliff2.xlf');

expect(fs.existsSync(xlfOutput)).toBeTruthy();

it('should not emit js', () => {

let basePath = fs.resolve(baseUrl, extractPathPrefix(path));


logger.debug(`Attempting to remove lock-file at ${lockFilePath}.`);

const r3SymbolsFilePath = fs.resolve(directory, filename);



fs.resolve(typingsRoot, fs.relative(formatRoot, sf.fileName.replace(/\.js$/, '.d.ts')));

if (entryPointConfig === undefined) {

fs.resolve(entryPointPath, typings.replace(/\.d\.ts$/, '') + '.metadata.json');



const typingsPath = fs.resolve(entryPointPath, relativeTypingsPath);

function readFile(absPath: AbsoluteFsPath, fs: ReadonlyFileSystem): string|undefined {


const testPath = absoluteFrom(path + postFix);


Information Disclosure (292)
Insecure Use of Language/Framework API (10)
Insecure Use of SQL Queries (30)
Insecure Processing of Data (26)
Insecure Access Control (19)
Insecure Use of Crypto (171)

👉 Go to the dashboard for detailed results.

📥 Happy? Share your feedback with us.

@kadirselcuk merged commit 2797fc8 into labs/router on May 26, 2021