Rebased version of the elastic PR (#225)

* Add elastic to our docker compose

* add AND/OR/NOT search operations

* add elastic and create article in elastic

* change error code when elastic throws error

* add search pages in elastic

* add search by labels

* Add elastic to GitHub Action

* Update elastic version

* Fix port for elastic

* add url in search query

* Set elastic features when running tests

* add debug logs

* Use localhost instead of service hostname

* refresh elastic after create/update

* update search labels query

* add typescript support

* search pages in elastic

* fix search queries

* use elastic for saving page

* fix test failure

* update getArticle api to use elastic

* use generic get page function

* add elastic migration python script

* fix bulk helper param

* save elastic page id in article_saving_request instead of postgres article_id

* fix page archiving and deleting

* add tests for deleteArticle

* remove custom date type in elastic mappings which does not exist in older versions of elastic

* fix timestamp format issue

* add tests for save reading progress

* add tests for save file

* optimize search results

* add alias to index

* update migration script to receive env var as params

* Add failing test to validate we don't decrease reading progress

This test is failing with Elastic because we aren't fetching
the reading progress from elastic here; we are still fetching it
from postgres.

* Rename readingProgress to readingProgressPercent

This is the name stored in elastic, so this fixes issues pulling
the value out.

* Linting

* Add failing test for creating highlights w/elastic

This test fails because the highlight can't be looked up. Is there
a different ID we should be passing in to query for highlights,
or do we need to update the query to look for elastic_id?

* add tests code coverage threshold

* update nyc config

* include more files in test coverage

* change alias name

* update updateContent to update pages in elastic

* remove debug log

* fix createHighlight test

* search pages by alias in elastic

* update set labels and delete labels in elastic

* migration script enumeration

* make BULK_SIZE an env var

* fix pdf search indexing

* debug github action exit issue

* call pubsub when create/update/delete page in elastic

* fix json parsing bug and reduce reading data from file

* replace a deprecated pubsub api call

* debug github action exit issue

* debug github action exit issue

* add handler to upload elastic page data to GCS

* fix tests

* Use http_auth instead of basic_auth

* add index creation and existing postgres tables update in migration script

* fix a typo to connect to elastic

* rename readingProgress to readingProgressPercent

* migrate elastic_page_id in highlights and article_saving_request tables

* update migration script to include number of updated rows

* update db migration query

* read index mappings from file

* fix upload pages to gcs

* fix test failures due to pageContext

* fix error when upload file id does not exist

* Handle savedAt & isArchived attributes w/out querying elastic

* Fix prettier issues

* fix content-type mismatching

* revert pageId to linkId because frontend was not deployed yet

* fix newsletters and attachments not saved in elastic

* put linkId in article for setting labels

* exclude originalHtml from search results to improve performance

* exclude content from search results to improve performance

* remove score sorting

* do not refresh immediately to reduce searching and indexing time

* do not replace the backup data in gcs

* fix no article id defined in articleSavingRequest

* add logging of elastic api running time

* reduce home feed pagination size to 15

* reduce home feed pagination size to 10

* stop revalidating first page

* do not use a separate api to fetch reading progress

* Remove unused comment

* get reading progress if it does not exist

* replace ngram tokenizer with standard tokenizer

* fix tests

* remove .env.local

* add sort keyword in searching to sort by score

Co-authored-by: Hongbo Wu <hongbo@omnivore.app>
Authored by Jackson Harper on 2022-03-15 21:08:59 -07:00; committed by GitHub.
parent c4a3edc6be
commit e652a6ea8c
51 changed files with 2582 additions and 724 deletions


@ -27,6 +27,21 @@ jobs:
--health-retries 5
ports:
- 5432
elastic:
image: docker.elastic.co/elasticsearch/elasticsearch:7.12.0
env:
discovery.type: single-node
http.cors.allow-origin: '*'
http.cors.enabled: true
http.cors.allow-headers: 'X-Requested-With,X-Auth-Token,Content-Type,Content-Length,Authorization'
http.cors.allow-credentials: true
options: >-
--health-cmd "curl http://0.0.0.0:9200/_cluster/health"
--health-interval 10s
--health-timeout 5s
--health-retries 10
ports:
- 9200
steps:
- uses: actions/checkout@v2
with:
@ -54,6 +69,7 @@ jobs:
yarn build
yarn lint
yarn test
env:
PG_HOST: localhost
PG_PORT: ${{ job.services.postgres.ports[5432] }}
@ -61,3 +77,4 @@ jobs:
PG_PASSWORD: postgres
PG_DB: omnivore_test
PG_POOL_MAX: 10
ELASTIC_URL: http://localhost:${{ job.services.elastic.ports[9200] }}/

.gitignore vendored

@ -59,3 +59,9 @@ omnivore-app-sa-gcs*
# Emacs temp files
*~
# code coverage files
.nyc_output
# data migration file
data.json


@ -28,10 +28,42 @@ services:
- PG_USER=postgres
- PG_PASSWORD=postgres
- PG_DB=omnivore
- ES_URL=http://elastic:9200
- ES_USERNAME=blank
- ES_PASSWORD=blank
depends_on:
postgres:
condition: service_healthy
elastic:
image: docker.elastic.co/elasticsearch/elasticsearch:7.17.1
container_name: "omnivore-elastic"
healthcheck:
test: curl 0.0.0.0:9200/_cat/health >/dev/null || exit 1
interval: 2s
timeout: 2s
retries: 5
environment:
- discovery.type=single-node
- http.cors.allow-origin=*
- http.cors.enabled=true
- http.cors.allow-headers=X-Requested-With,X-Auth-Token,Content-Type,Content-Length,Authorization
- http.cors.allow-credentials=true
volumes:
- ./.docker/elastic-data:/usr/share/elasticsearch/data
ports:
- "9200:9200"
kibana:
image: docker.elastic.co/kibana/kibana:7.17.1
container_name: "omnivore-kibana"
environment:
- ELASTICSEARCH_HOSTS=http://elastic:9200
depends_on:
- elastic
ports:
- "5601:5601"
api:
build:
context: .
@ -62,6 +94,8 @@ services:
depends_on:
migrate:
condition: service_completed_successfully
elastic:
condition: service_healthy
web:
build:

packages/api/.nycrc Normal file

@ -0,0 +1,15 @@
{
"extends": "@istanbuljs/nyc-config-typescript",
"check-coverage": true,
"all": true,
"include": [
"src/**/*.ts"
],
"reporter": [
"text-summary"
],
"branches": 0,
"lines": 0,
"functions": 0,
"statements": 0
}


@ -0,0 +1,199 @@
#!/usr/bin/python
import os
import json
import psycopg2
from psycopg2.extras import RealDictCursor
from elasticsearch import Elasticsearch, helpers
PG_HOST = os.getenv('PG_HOST', 'localhost')
PG_PORT = os.getenv('PG_PORT', 5432)
PG_USER = os.getenv('PG_USER', 'app_user')
PG_PASSWORD = os.getenv('PG_PASSWORD', 'app_pass')
PG_DB = os.getenv('PG_DB', 'omnivore')
ES_URL = os.getenv('ES_URL', 'http://localhost:9200')
ES_USERNAME = os.getenv('ES_USERNAME', 'elastic')
ES_PASSWORD = os.getenv('ES_PASSWORD', 'password')
DATA_FILE = os.getenv('DATA_FILE', 'data.json')
BULK_SIZE = int(os.getenv('BULK_SIZE', 100))
UPDATE_TIME = os.getenv('UPDATE_TIME', '2019-01-01 00:00:00')
INDEX_SETTINGS = os.getenv('INDEX_SETTINGS', 'index_settings.json')
DATETIME_FORMAT = 'YYYY-MM-DD"T"HH24:MI:SS.MS"Z"'
QUERY = f'''
SELECT
p.id,
title,
description,
to_char(l.created_at, '{DATETIME_FORMAT}') as "createdAt",
to_char(l.updated_at, '{DATETIME_FORMAT}') as "updatedAt",
url,
hash,
original_html as "originalHtml",
content,
author,
image,
to_char(published_at, '{DATETIME_FORMAT}') as "publishedAt",
upload_file_id as "uploadFileId",
page_type as "pageType",
user_id as "userId",
to_char(shared_at, '{DATETIME_FORMAT}') as "sharedAt",
article_reading_progress as "readingProgressPercent",
article_reading_progress_anchor_index as "readingProgressAnchorIndex",
to_char(saved_at, '{DATETIME_FORMAT}') as "savedAt",
slug,
to_char(archived_at, '{DATETIME_FORMAT}') as "archivedAt"
FROM omnivore.pages p
LEFT JOIN omnivore.links l ON p.id = l.article_id
WHERE p.updated_at > '{UPDATE_TIME}'
'''
UPDATE_ARTICLE_SAVING_REQUEST_SQL = f'''
UPDATE omnivore.article_saving_request
SET elastic_page_id = article_id
WHERE elastic_page_id is NULL
AND article_id is NOT NULL
AND updated_at > '{UPDATE_TIME}';
'''
UPDATE_HIGHLIGHT_SQL = f'''
UPDATE omnivore.highlight
SET elastic_page_id = article_id
WHERE elastic_page_id is NULL
AND article_id is NOT NULL
AND updated_at > '{UPDATE_TIME}';
'''
def create_index(client):
print('Creating index')
try:
# check if index exists
if client.indices.exists(index='pages_alias'):
print('Index already exists')
return
# create index
with open(INDEX_SETTINGS, 'r') as f:
settings = json.load(f)
client.indices.create(index='pages', body=settings)
print('Index created')
except Exception as err:
print('Create index ERROR:', err)
exit(1)
def update_postgres_data(conn, query, table):
try:
print('Executing query: {}'.format(query))
# update data in postgres
cursor = conn.cursor()
cursor.execute(query)
count = cursor.rowcount
conn.commit()
cursor.close()
print(f'Updated {table} in postgres, rows: ', count)
except Exception as err:
print('Update postgres data ERROR:', err)
def elastic_bulk_insert(client, doc_list) -> int:
print('Bulk docs length:', len(doc_list))
try:
# use the helpers library's Bulk API to index list of
# Elasticsearch docs
resp = helpers.bulk(
client,
doc_list,
request_timeout=30,
)
# print the response returned by Elasticsearch
print('helpers.bulk() RESPONSE:',
json.dumps(resp, indent=2))
return resp[0]
except Exception as err:
print('Elasticsearch helpers.bulk() ERROR:', err)
return 0
def ingest_data_to_elastic(conn, query, data_file):
try:
print('Executing query: {}'.format(query))
# export data from postgres
count = 0
import_count = 0
with open(data_file, 'w') as f:
cursor = conn.cursor(cursor_factory=RealDictCursor)
cursor.execute(query)
result = cursor.fetchmany(BULK_SIZE)
while len(result) > 0:
print(f'Writing {len(result)} docs to file')
import_count += import_data_to_es(client, result)
count += len(result)
f.write(json.dumps(result, indent=2, default=str))
result = cursor.fetchmany(BULK_SIZE)
cursor.close()
print(f'Exported {count} rows to data.json')
print(f'Imported {import_count} rows to es')
except Exception as err:
print('Export data to json ERROR:', err)
def import_data_to_es(client, docs) -> int:
# import data to elasticsearch
print('Importing docs to elasticsearch:', len(docs))
if len(docs) == 0:
print('No data to import')
return 0
print('Attempting to index the list of docs using helpers.bulk()')
doc_list = []
for doc in docs:
# convert the string to a dict object
dict_doc = {
'_index': 'pages',
'_id': doc['id'],
'_source': doc
}
doc_list += [dict_doc]
count = elastic_bulk_insert(client, doc_list)
print(f'Imported {count} docs to elasticsearch')
return count
print('Starting migration')
# test elastic client
client = Elasticsearch(ES_URL, http_auth=(
ES_USERNAME, ES_PASSWORD), retry_on_timeout=True)
try:
print('Elasticsearch client connected', client.info())
except Exception as err:
print('Elasticsearch client ERROR:', err)
exit(1)
# test postgres client
conn = psycopg2.connect(
f'host={PG_HOST} port={PG_PORT} dbname={PG_DB} user={PG_USER} \
password={PG_PASSWORD}')
print('Postgres connection:', conn.info)
create_index(client)
# ingest data from postgres to es and json file (for debugging)
ingest_data_to_elastic(conn, QUERY, DATA_FILE)
# update existing tables
update_postgres_data(conn, UPDATE_ARTICLE_SAVING_REQUEST_SQL,
'article_saving_request')
update_postgres_data(conn, UPDATE_HIGHLIGHT_SQL, 'highlight')
client.close()
conn.close()
print('Migration complete')
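The fetchmany/BULK_SIZE loop in the migration script above drains the Postgres cursor in fixed-size batches and tallies exported vs. imported rows. A minimal standalone sketch of that batching pattern (TypeScript for consistency with the API code; the in-memory array stands in for the cursor and `helpers.bulk`):

```typescript
// Yield fixed-size slices of a row set, mirroring cursor.fetchmany(BULK_SIZE).
function* fetchBatches<T>(rows: T[], bulkSize: number): Generator<T[]> {
  for (let i = 0; i < rows.length; i += bulkSize) {
    yield rows.slice(i, i + bulkSize)
  }
}

const rows = Array.from({ length: 250 }, (_, i) => ({ id: i }))
let exported = 0
for (const batch of fetchBatches(rows, 100)) {
  // stand-in for helpers.bulk(client, batch) in the script above
  exported += batch.length
}
console.log(exported) // 250
```

Note that `os.getenv` always returns a string, which is why the script must cast `BULK_SIZE` to an int before passing it to `fetchmany`.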


@ -0,0 +1,79 @@
{
"aliases": {
"pages_alias": {}
},
"settings": {
"analysis": {
"analyzer": {
"strip_html_analyzer": {
"tokenizer": "standard",
"char_filter": ["html_strip"]
}
},
"normalizer": {
"lowercase_normalizer": {
"filter": ["lowercase"]
}
}
}
},
"mappings": {
"properties": {
"userId": {
"type": "keyword"
},
"title": {
"type": "text"
},
"author": {
"type": "text"
},
"description": {
"type": "text"
},
"content": {
"type": "text",
"analyzer": "strip_html_analyzer"
},
"url": {
"type": "keyword"
},
"uploadFileId": {
"type": "keyword"
},
"pageType": {
"type": "keyword"
},
"slug": {
"type": "keyword"
},
"labels": {
"type": "nested",
"properties": {
"name": {
"type": "keyword",
"normalizer": "lowercase_normalizer"
}
}
},
"readingProgressPercent": {
"type": "float"
},
"readingProgressAnchorIndex": {
"type": "integer"
},
"createdAt": {
"type": "date"
},
"savedAt": {
"type": "date"
},
"archivedAt": {
"type": "date"
},
"siteName": {
"type": "text"
}
}
}
}
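Because `labels.name` in the mappings above is a keyword with a `lowercase_normalizer`, label terms should be lowercased on the query side so that "Tech" and "tech" match the same documents. A sketch of that query-side convention (not the PR's code; the helper name is illustrative):

```typescript
// Shape of a nested terms filter against the labels field defined above.
type NestedLabelFilter = {
  nested: {
    path: 'labels'
    query: { terms: { 'labels.name': string[] } }
  }
}

function labelTermsFilter(labels: string[]): NestedLabelFilter {
  return {
    nested: {
      path: 'labels',
      // lowercase to mirror the index-side lowercase_normalizer
      query: { terms: { 'labels.name': labels.map((l) => l.toLowerCase()) } },
    },
  }
}

const filter = labelTermsFilter(['Tech', 'News'])
console.log(JSON.stringify(filter.nested.query.terms['labels.name'])) // ["tech","news"]
```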


@ -8,9 +8,10 @@
"start": "node dist/server.js",
"lint": "eslint src --ext ts,js,tsx,jsx",
"lint:fix": "eslint src --fix --ext ts,js,tsx,jsx",
"test": "yarn mocha -r ts-node/register --config mocha-config.json --exit --timeout 10000"
"test": "nyc mocha -r ts-node/register --config mocha-config.json --exit --timeout 10000"
},
"dependencies": {
"@elastic/elasticsearch": "~7.12.0",
"@google-cloud/logging-winston": "^4.1.2",
"@google-cloud/monitoring": "^2.3.5",
"@google-cloud/opentelemetry-cloud-trace-exporter": "^1.1.0",
@ -101,6 +102,7 @@
},
"devDependencies": {
"@babel/register": "^7.14.5",
"@istanbuljs/nyc-config-typescript": "^1.0.2",
"@types/analytics-node": "^3.1.7",
"@types/highlightjs": "^9.12.2",
"@types/nanoid": "^3.0.0",
@ -110,6 +112,7 @@
"circular-dependency-plugin": "^5.2.0",
"mocha": "^9.0.1",
"nock": "^13.2.4",
"nyc": "^15.1.0",
"postgrator": "^4.2.0",
"ts-node-dev": "^1.1.8"
},


@ -39,25 +39,6 @@ class ArticleModel extends DataModel<ArticleData, CreateSet, UpdateSet> {
this.loader.prime(row.id, row)
return row
}
@logMethod
async updateContent(
id: string,
content: string,
title?: string,
author?: string,
description?: string,
tx = this.kx
): Promise<boolean> {
const items = { content, title, author, description }
if (!title) delete items.title
if (!author) delete items.author
if (!description) delete items.description
const stmt = tx(this.tableName).update(items).where({ id }).returning('*')
const [row] = await stmt
return !!row
}
}
export default ArticleModel


@ -24,6 +24,7 @@ export interface ArticleSavingRequestData {
createdAt: Date
updatedAt: Date
taskName?: string
elasticPageId?: string
}
export const keys = [
@ -35,6 +36,7 @@ export const keys = [
'createdAt',
'updatedAt',
'taskName',
'elasticPageId',
] as const
export const defaultedKeys = ['id', 'createdAt', 'updatedAt', 'status'] as const
@ -51,6 +53,7 @@ export const updateKeys = [
'status',
'errorCode',
'taskName',
'elasticPageId',
] as const
export type UpdateSet = PickTuple<ArticleSavingRequestData, typeof updateKeys>


@ -62,7 +62,7 @@ class HighlightModel extends DataModel<HighlightData, CreateSet, UpdateSet> {
const result = await this.kx(Table.HIGHLIGHT)
.select(modelKeys)
.whereIn('articleId', articleIds)
.whereIn('elasticPageId', articleIds)
.andWhere('deleted', false)
.orderBy(`${Table.HIGHLIGHT}.created_at`, 'desc')
.limit(MAX_RECORDS_LIMIT)
@ -79,7 +79,7 @@ class HighlightModel extends DataModel<HighlightData, CreateSet, UpdateSet> {
)
highlights.forEach((highlight) => {
const index = positions[highlight.articleId]
const index = positions[highlight.elasticPageId]
result[index].push({
...highlight,
updatedAt: highlight.updatedAt || highlight.createdAt,
@ -108,7 +108,7 @@ class HighlightModel extends DataModel<HighlightData, CreateSet, UpdateSet> {
): Promise<HighlightData[]> {
const rows: HighlightData[] = await tx(this.tableName)
.update({ sharedAt: null })
.where({ articleId, userId })
.where({ elasticPageId: articleId, userId })
.andWhere(tx.raw(`shared_at is not null`))
.returning(this.modelKeys)
@ -156,7 +156,7 @@ class HighlightModel extends DataModel<HighlightData, CreateSet, UpdateSet> {
const highlights = await this.kx(Table.HIGHLIGHT)
.select(modelKeys)
.where('user_id', userId)
.andWhere('article_id', articleId)
.andWhere('elastic_page_id', articleId)
.andWhere('deleted', false)
.orderBy(`${Table.HIGHLIGHT}.created_at`, 'desc')
.limit(MAX_RECORDS_LIMIT)


@ -24,7 +24,7 @@ export interface HighlightData {
id: string
shortId: string
userId: string
articleId: string
articleId?: string
quote: string
prefix?: string | null
suffix?: string | null
@ -34,6 +34,7 @@ export interface HighlightData {
createdAt: Date
updatedAt?: Date | null
sharedAt?: Date | null
elasticPageId: string
}
export const keys = [
@ -50,6 +51,7 @@ export const keys = [
'createdAt',
'updatedAt',
'sharedAt',
'elasticPageId',
] as const
export const defaultedKeys = [


@ -23,7 +23,7 @@ import {
import { ENABLE_DB_REQUEST_LOGGING, globalCounter, logMethod } from '../helpers'
import DataLoader from 'dataloader'
import { ArticleData } from '../article/model'
import { InFilter, ReadFilter } from '../../utils/search'
import { InFilter, LabelFilter, ReadFilter } from '../../utils/search'
type PartialArticle = Omit<
Article,
@ -404,7 +404,7 @@ class UserArticleModel extends DataModel<
inFilter: InFilter
readFilter: ReadFilter
typeFilter: PageType | undefined
labelFilters?: string[]
labelFilters: LabelFilter[]
},
userId: string,
tx = this.kx,
@ -443,15 +443,6 @@ class UserArticleModel extends DataModel<
}
}
// search by labels using lowercase
if (args.labelFilters) {
queryPromise
.innerJoin(Table.LINK_LABELS, 'link_labels.link_id', 'links.id')
.innerJoin(Table.LABELS, 'labels.id', 'link_labels.label_id')
.whereRaw('LOWER(omnivore.labels.name) = ANY(?)', [args.labelFilters])
.distinct('links.id')
}
if (notNullField) {
queryPromise.whereNotNull(notNullField)
}


@ -3,6 +3,7 @@
import Knex from 'knex'
import { LinkShareInfo } from '../../generated/graphql'
import { DataModels } from '../../resolvers/types'
import { getPageByParam } from '../../elastic'
// once we have links setup properly in the API we will remove this method
// and have a getShareInfoForLink method
@ -15,15 +16,15 @@ export const getShareInfoForArticle = async (
// TEMP: because the old API uses articles instead of Links, we are actually
// getting an article ID here and need to map it to a link ID. When the API
// is updated to use Links instead of Articles this will be removed.
const link = await models.userArticle.getByArticleId(userId, articleId, kx)
const page = await getPageByParam({ userId, _id: articleId })
if (!link) {
if (!page) {
return undefined
}
const result = await kx('omnivore.link_share_info')
.select('*')
.where({ linkId: link.id })
.where({ elastic_page_id: page.id })
.first()
return result


@ -2,6 +2,7 @@ import { PubSub } from '@google-cloud/pubsub'
import { env } from '../env'
import { ReportType } from '../generated/graphql'
import express from 'express'
import { Page } from '../elastic/types'
export const createPubSubClient = (): PubsubClient => {
const client = new PubSub()
@ -15,7 +16,7 @@ export const createPubSubClient = (): PubsubClient => {
console.log(`Publishing ${topicName}`, msg)
return client
.topic(topicName)
.publish(msg)
.publishMessage({ data: msg })
.catch((err) => {
console.error(`[PubSub] error: ${topicName}`, err)
})
@ -36,25 +37,14 @@ export const createPubSubClient = (): PubsubClient => {
Buffer.from(JSON.stringify({ userId, email, name, username }))
)
},
pageSaved: (
userId: string,
url: string,
content: string
): Promise<void> => {
return publish(
'pageSaved',
Buffer.from(JSON.stringify({ userId, url, content }))
)
pageSaved: (page: Partial<Page>): Promise<void> => {
return publish('pageSaved', Buffer.from(JSON.stringify(page)))
},
pageCreated: (
userId: string | undefined,
url: string,
content: string
): Promise<void> => {
return publish(
'pageCreated',
Buffer.from(JSON.stringify({ userId, url, content }))
)
pageCreated: (page: Page): Promise<void> => {
return publish('pageCreated', Buffer.from(JSON.stringify(page)))
},
pageDeleted: (id: string): Promise<void> => {
return publish('pageDeleted', Buffer.from(JSON.stringify({ id })))
},
reportSubmitted: (
submitterId: string,
@ -79,8 +69,9 @@ export interface PubsubClient {
name: string,
username: string
) => Promise<void>
pageCreated: (userId: string, url: string, content: string) => Promise<void>
pageSaved: (userId: string, url: string, content: string) => Promise<void>
pageCreated: (page: Page) => Promise<void>
pageSaved: (page: Partial<Page>) => Promise<void>
pageDeleted: (id: string) => Promise<void>
reportSubmitted(
submitterId: string | undefined,
itemUrl: string,
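The diff above changes the pubsub interface from positional `(userId, url, content)` arguments to whole-`Page` payloads. A hedged sketch of the new shape, with an in-memory publisher standing in for `@google-cloud/pubsub` (the `Page` fields here are a subset for illustration):

```typescript
interface Page { id: string; userId: string; url: string }

// Captured messages, in place of real topic publishes.
const published: { topic: string; data: string }[] = []
const publish = (topic: string, data: Buffer): Promise<void> => {
  published.push({ topic, data: data.toString() })
  return Promise.resolve()
}

const pubsub = {
  pageCreated: (page: Page) =>
    publish('pageCreated', Buffer.from(JSON.stringify(page))),
  pageSaved: (page: Partial<Page>) =>
    publish('pageSaved', Buffer.from(JSON.stringify(page))),
  pageDeleted: (id: string) =>
    publish('pageDeleted', Buffer.from(JSON.stringify({ id }))),
}

void pubsub.pageCreated({ id: 'p1', userId: 'u1', url: 'https://example.com' })
void pubsub.pageDeleted('p1')
console.log(published.map((m) => m.topic).join(',')) // pageCreated,pageDeleted
```

Serializing the full page keeps producers and consumers decoupled from the old three-argument call sites; the diff also swaps the deprecated `publish(Buffer)` call for `publishMessage({ data })`.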


@ -0,0 +1,458 @@
/* eslint-disable @typescript-eslint/no-unsafe-member-access */
/* eslint-disable @typescript-eslint/no-unsafe-assignment */
/* eslint-disable @typescript-eslint/no-unsafe-call */
import { env } from '../env'
import { Client } from '@elastic/elasticsearch'
import { PageType, SortBy, SortOrder, SortParams } from '../generated/graphql'
import {
InFilter,
LabelFilter,
LabelFilterType,
ReadFilter,
} from '../utils/search'
import {
Page,
PageContext,
ParamSet,
SearchBody,
SearchResponse,
} from './types'
import { readFileSync } from 'fs'
import { join } from 'path'
const INDEX_NAME = 'pages'
const INDEX_ALIAS = 'pages_alias'
const client = new Client({
node: env.elastic.url,
maxRetries: 3,
requestTimeout: 50000,
auth: {
username: env.elastic.username,
password: env.elastic.password,
},
})
const ingest = async (): Promise<void> => {
// read index settings from file
const indexSettings = readFileSync(
join(__dirname, '..', '..', 'index_settings.json'),
'utf8'
)
// create index
await client.indices.create({
index: INDEX_NAME,
body: indexSettings,
})
}
const appendQuery = (body: SearchBody, query: string): void => {
body.query.bool.should.push({
multi_match: {
query,
fields: ['title', 'content', 'author', 'description', 'siteName'],
operator: 'and',
type: 'cross_fields',
},
})
body.query.bool.minimum_should_match = 1
}
const appendTypeFilter = (body: SearchBody, filter: PageType): void => {
body.query.bool.filter.push({
term: {
pageType: filter,
},
})
}
const appendReadFilter = (body: SearchBody, filter: ReadFilter): void => {
// a page is considered read once reading progress reaches 98%
switch (filter) {
case ReadFilter.UNREAD:
body.query.bool.filter.push({
range: {
readingProgress: {
lt: 98,
},
},
})
break
case ReadFilter.READ:
body.query.bool.filter.push({
range: {
readingProgress: {
gte: 98,
},
},
})
}
}
}
const appendInFilter = (body: SearchBody, filter: InFilter): void => {
switch (filter) {
case InFilter.ARCHIVE:
body.query.bool.filter.push({
exists: {
field: 'archivedAt',
},
})
break
case InFilter.INBOX:
body.query.bool.must_not.push({
exists: {
field: 'archivedAt',
},
})
}
}
const appendNotNullField = (body: SearchBody, field: string): void => {
body.query.bool.filter.push({
exists: {
field,
},
})
}
const appendExcludeLabelFilter = (
body: SearchBody,
filters: LabelFilter[]
): void => {
body.query.bool.must_not.push({
nested: {
path: 'labels',
query: filters.map((filter) => {
return {
terms: {
'labels.name': filter.labels,
},
}
}),
},
})
}
const appendIncludeLabelFilter = (
body: SearchBody,
filters: LabelFilter[]
): void => {
body.query.bool.filter.push({
nested: {
path: 'labels',
query: {
bool: {
filter: filters.map((filter) => {
return {
terms: {
'labels.name': filter.labels,
},
}
}),
},
},
},
})
}
export const createPage = async (
page: Page,
ctx: PageContext
): Promise<string | undefined> => {
try {
const { body } = await client.index({
id: page.id || undefined,
index: INDEX_ALIAS,
body: page,
refresh: ctx.refresh,
})
await ctx.pubsub.pageCreated(page)
return body._id as string
} catch (e) {
console.error('failed to create a page in elastic', e)
return undefined
}
}
export const updatePage = async (
id: string,
page: Partial<Page>,
ctx: PageContext
): Promise<boolean> => {
try {
const { body } = await client.update({
index: INDEX_ALIAS,
id,
body: {
doc: page,
},
refresh: ctx.refresh,
})
if (body.result !== 'updated') return false
await ctx.pubsub.pageSaved(page)
return true
} catch (e) {
console.error('failed to update a page in elastic', e)
return false
}
}
export const deletePage = async (
id: string,
ctx: PageContext
): Promise<boolean> => {
try {
const { body } = await client.delete({
index: INDEX_ALIAS,
id,
refresh: ctx.refresh,
})
if (body.result !== 'deleted') return false
await ctx.pubsub.pageDeleted(id)
return true
} catch (e) {
console.error('failed to delete a page in elastic', e)
return false
}
}
export const deleteLabelInPages = async (
userId: string,
label: string,
ctx: PageContext
): Promise<void> => {
try {
await client.updateByQuery({
index: INDEX_ALIAS,
body: {
script: {
source:
'ctx._source.labels.removeIf(label -> label.name == params.label)',
lang: 'painless',
params: {
label: label,
},
},
query: {
bool: {
filter: [
{
term: {
userId,
},
},
{
nested: {
path: 'labels',
query: {
term: {
'labels.name': label,
},
},
},
},
],
},
},
},
refresh: ctx.refresh,
})
} catch (e) {
console.error('failed to delete a page in elastic', e)
}
}
export const getPageByParam = async <K extends keyof ParamSet>(
param: Record<K, Page[K]>
): Promise<Page | undefined> => {
try {
const params = {
query: {
bool: {
filter: Object.keys(param).map((key) => {
return {
term: {
[key]: param[key as K],
},
}
}),
},
},
size: 1,
_source: {
excludes: ['originalHtml'],
},
}
const { body } = await client.search({
index: INDEX_ALIAS,
body: params,
})
if (body.hits.total.value === 0) {
return undefined
}
return {
...body.hits.hits[0]._source,
id: body.hits.hits[0]._id,
} as Page
} catch (e) {
console.error('failed to search pages in elastic', e)
return undefined
}
}
export const getPageById = async (id: string): Promise<Page | undefined> => {
try {
const { body } = await client.get({
index: INDEX_ALIAS,
id,
})
return {
...body._source,
id: body._id,
} as Page
} catch (e) {
console.error('failed to search pages in elastic', e)
return undefined
}
}
export const searchPages = async (
args: {
from?: number
size?: number
sort?: SortParams
query?: string
inFilter: InFilter
readFilter: ReadFilter
typeFilter?: PageType
labelFilters: LabelFilter[]
},
userId: string,
notNullField: string | null = null
): Promise<[Page[], number] | undefined> => {
try {
const {
from = 0,
size = 10,
sort,
query,
readFilter,
typeFilter,
labelFilters,
inFilter,
} = args
const sortOrder = sort?.order === SortOrder.Ascending ? 'asc' : 'desc'
const sortField = sort?.by === SortBy.Score ? '_score' : 'updatedAt'
const includeLabels = labelFilters.filter(
(filter) => filter.type === LabelFilterType.INCLUDE
)
const excludeLabels = labelFilters.filter(
(filter) => filter.type === LabelFilterType.EXCLUDE
)
const body: SearchBody = {
query: {
bool: {
filter: [
{
term: {
userId,
},
},
],
should: [],
must_not: [],
},
},
sort: [
{
[sortField]: {
order: sortOrder,
},
},
],
from,
size,
_source: {
excludes: ['originalHtml', 'content'],
},
}
// append filters
if (query) {
appendQuery(body, query)
}
if (typeFilter) {
appendTypeFilter(body, typeFilter)
}
if (inFilter !== InFilter.ALL) {
appendInFilter(body, inFilter)
}
if (readFilter !== ReadFilter.ALL) {
appendReadFilter(body, readFilter)
}
if (notNullField) {
appendNotNullField(body, notNullField)
}
if (includeLabels.length > 0) {
appendIncludeLabelFilter(body, includeLabels)
}
if (excludeLabels.length > 0) {
appendExcludeLabelFilter(body, excludeLabels)
}
console.log('searching pages in elastic', JSON.stringify(body))
const response = await client.search<SearchResponse<Page>, SearchBody>({
index: INDEX_ALIAS,
body,
})
if (response.body.hits.total.value === 0) {
return [[], 0]
}
return [
response.body.hits.hits.map((hit: { _source: Page; _id: string }) => ({
...hit._source,
id: hit._id,
})),
response.body.hits.total.value,
]
} catch (e) {
console.error('failed to search pages in elastic', e)
return undefined
}
}
export const initElasticsearch = async (): Promise<void> => {
try {
const response = await client.info()
console.log('elastic info: ', response)
// check if index exists
const { body: indexExists } = await client.indices.exists({
index: INDEX_ALIAS,
})
if (!indexExists) {
console.log('ingesting index...')
await ingest()
await client.indices.refresh({ index: INDEX_ALIAS })
}
console.log('elastic client is ready')
} catch (e) {
console.error('failed to init elasticsearch', e)
throw e
}
}
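The module above composes a single `bool` query from the user filter, the free-text `multi_match`, and the in/read/label filters. An illustrative standalone reconstruction of that assembly (names mirror the module, but `buildSearchBody` itself is hypothetical):

```typescript
type BoolQuery = {
  filter: unknown[]
  should: unknown[]
  must_not: unknown[]
  minimum_should_match?: number
}

function buildSearchBody(
  userId: string,
  query?: string,
  archived = false
): { query: { bool: BoolQuery } } {
  const bool: BoolQuery = {
    // every search is scoped to the requesting user
    filter: [{ term: { userId } }],
    should: [],
    must_not: [],
  }
  if (query) {
    bool.should.push({
      multi_match: {
        query,
        fields: ['title', 'content', 'author', 'description', 'siteName'],
        operator: 'and',
        type: 'cross_fields',
      },
    })
    // with a free-text query, at least one should clause must match
    bool.minimum_should_match = 1
  }
  if (archived) {
    // InFilter.ARCHIVE: only pages with archivedAt set
    bool.filter.push({ exists: { field: 'archivedAt' } })
  } else {
    // InFilter.INBOX: exclude archived pages
    bool.must_not.push({ exists: { field: 'archivedAt' } })
  }
  return { query: { bool } }
}

const body = buildSearchBody('user-1', 'typescript')
console.log(body.query.bool.minimum_should_match) // 1
```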


@ -0,0 +1,154 @@
// Define the type of the body for the Search request
import { Label, PageType } from '../generated/graphql'
import { PickTuple } from '../util'
import { PubsubClient } from '../datalayer/pubsub'
export interface SearchBody {
query: {
bool: {
filter: (
| {
term: {
userId: string
}
}
| { term: { pageType: string } }
| { exists: { field: string } }
| {
range: {
readingProgress: { gte: number } | { lt: number }
}
}
| {
nested: {
path: 'labels'
query: {
bool: {
filter: {
terms: {
'labels.name': string[]
}
}[]
}
}
}
}
)[]
should: {
multi_match: {
query: string
fields: string[]
operator: 'and' | 'or'
type:
| 'best_fields'
| 'most_fields'
| 'cross_fields'
| 'phrase'
| 'phrase_prefix'
}
}[]
minimum_should_match?: number
must_not: (
| {
exists: {
field: string
}
}
| {
nested: {
path: 'labels'
query: {
terms: {
'labels.name': string[]
}
}[]
}
}
)[]
}
}
sort: [Record<string, { order: string }>]
from: number
size: number
_source: {
excludes: string[]
}
}
// Complete definition of the Search response
export interface ShardsResponse {
total: number
successful: number
failed: number
skipped: number
}
export interface Explanation {
value: number
description: string
details: Explanation[]
}
export interface SearchResponse<T> {
took: number
timed_out: boolean
_scroll_id?: string
_shards: ShardsResponse
hits: {
total: {
value: number
}
max_score: number
hits: Array<{
_index: string
_type: string
_id: string
_score: number
_source: T
_version?: number
_explanation?: Explanation
fields?: never
highlight?: never
inner_hits?: never
matched_queries?: string[]
sort?: string[]
}>
}
aggregations?: never
}
export interface Page {
id: string
userId: string
title: string
author?: string
description?: string
content: string
url: string
hash: string
uploadFileId?: string | null
image?: string
pageType: PageType
originalHtml?: string | null
slug: string
labels?: Label[]
readingProgressPercent?: number
readingProgressAnchorIndex?: number
createdAt: Date
updatedAt?: Date
publishedAt?: Date
savedAt?: Date
sharedAt?: Date
archivedAt?: Date | null
siteName?: string
_id?: string
}
const keys = ['_id', 'url', 'slug', 'userId', 'uploadFileId'] as const
export type ParamSet = PickTuple<Page, typeof keys>
export interface PageContext {
pubsub: PubsubClient
refresh?: boolean
}
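`ParamSet` above restricts which keys `getPageByParam` may look up, and each key/value pair in the param record becomes one `term` filter. A small sketch of that mapping (standalone; the local `ParamSet` alias is a simplification of the `PickTuple` version above):

```typescript
// Only these keys are valid lookup params, matching the keys tuple above.
type ParamSet = Partial<
  Record<'_id' | 'url' | 'slug' | 'userId' | 'uploadFileId', string>
>

function termFilters(param: ParamSet): { term: Record<string, string> }[] {
  return Object.entries(param).map(([key, value]) => ({
    term: { [key]: value as string },
  }))
}

const filters = termFilters({ userId: 'u1', slug: 'my-article' })
console.log(filters.length) // 2
```

Combined under `bool.filter` with `size: 1`, these terms reproduce the single-page lookup used by `getPageByParam`.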


@ -0,0 +1,39 @@
import {
BaseEntity,
Column,
CreateDateColumn,
Entity,
JoinColumn,
ManyToOne,
PrimaryGeneratedColumn,
UpdateDateColumn,
} from 'typeorm'
import { User } from './user'
@Entity({ name: 'upload_files' })
export class UploadFile extends BaseEntity {
@PrimaryGeneratedColumn('uuid')
id!: string
@ManyToOne(() => User)
@JoinColumn({ name: 'user_id' })
user!: User
@Column('text')
url!: string
@Column('text')
fileName!: string
@Column('text')
contentType!: string
@Column('text')
status!: string
@CreateDateColumn()
createdAt?: Date
@UpdateDateColumn()
updatedAt?: Date
}


@ -68,6 +68,7 @@ export type Article = {
shareInfo?: Maybe<LinkShareInfo>;
slug: Scalars['String'];
title: Scalars['String'];
uploadFileId?: Maybe<Scalars['ID']>;
url: Scalars['String'];
};
@ -168,6 +169,7 @@ export type CreateArticleError = {
};
export enum CreateArticleErrorCode {
ElasticError = 'ELASTIC_ERROR',
NotAllowedToParse = 'NOT_ALLOWED_TO_PARSE',
PayloadTooLarge = 'PAYLOAD_TOO_LARGE',
UnableToFetch = 'UNABLE_TO_FETCH',
@ -629,7 +631,7 @@ export type HighlightStats = {
export type Label = {
__typename?: 'Label';
color: Scalars['String'];
createdAt: Scalars['Date'];
createdAt?: Maybe<Scalars['Date']>;
description?: Maybe<Scalars['String']>;
id: Scalars['ID'];
name: Scalars['String'];
@ -1510,6 +1512,7 @@ export type SignupSuccess = {
};
export enum SortBy {
Score = 'SCORE',
UpdatedTime = 'UPDATED_TIME'
}
@@ -2404,6 +2407,7 @@ export type ArticleResolvers<ContextType = ResolverContext, ParentType extends R
shareInfo?: Resolver<Maybe<ResolversTypes['LinkShareInfo']>, ParentType, ContextType>;
slug?: Resolver<ResolversTypes['String'], ParentType, ContextType>;
title?: Resolver<ResolversTypes['String'], ParentType, ContextType>;
uploadFileId?: Resolver<Maybe<ResolversTypes['ID']>, ParentType, ContextType>;
url?: Resolver<ResolversTypes['String'], ParentType, ContextType>;
__isTypeOf?: IsTypeOfResolverFn<ParentType, ContextType>;
};
@@ -2805,7 +2809,7 @@ export type HighlightStatsResolvers<ContextType = ResolverContext, ParentType ex
export type LabelResolvers<ContextType = ResolverContext, ParentType extends ResolversParentTypes['Label'] = ResolversParentTypes['Label']> = {
color?: Resolver<ResolversTypes['String'], ParentType, ContextType>;
createdAt?: Resolver<ResolversTypes['Date'], ParentType, ContextType>;
createdAt?: Resolver<Maybe<ResolversTypes['Date']>, ParentType, ContextType>;
description?: Resolver<Maybe<ResolversTypes['String']>, ParentType, ContextType>;
id?: Resolver<ResolversTypes['ID'], ParentType, ContextType>;
name?: Resolver<ResolversTypes['String'], ParentType, ContextType>;

View File

@@ -49,6 +49,7 @@ type Article {
shareInfo: LinkShareInfo
slug: String!
title: String!
uploadFileId: ID
url: String!
}
@@ -134,6 +135,7 @@ type CreateArticleError {
}
enum CreateArticleErrorCode {
ELASTIC_ERROR
NOT_ALLOWED_TO_PARSE
PAYLOAD_TOO_LARGE
UNABLE_TO_FETCH
@@ -553,7 +555,7 @@ type HighlightStats {
type Label {
color: String!
createdAt: Date!
createdAt: Date
description: String
id: ID!
name: String!
@@ -1131,6 +1133,7 @@ type SignupSuccess {
}
enum SortBy {
SCORE
UPDATED_TIME
}

View File

@@ -61,13 +61,21 @@ import { traceAs } from '../../tracing'
import { createImageProxyUrl } from '../../utils/imageproxy'
import normalizeUrl from 'normalize-url'
import { WithDataSourcesContext } from '../types'
import { UserArticleData } from '../../datalayer/links/model'
import { parseSearchQuery } from '../../utils/search'
import { createPageSaveRequest } from '../../services/create_page_save_request'
import { createIntercomEvent } from '../../utils/intercom'
import { analytics } from '../../utils/analytics'
import { env } from '../../env'
import {
createPage,
deletePage,
getPageById,
getPageByParam,
searchPages,
updatePage,
} from '../../elastic'
import { Page } from '../../elastic/types'
export type PartialArticle = Omit<
Article,
@@ -240,7 +248,9 @@ export const createArticleResolver = authorized<
const saveTime = new Date()
const slug = generateSlug(parsedContent?.title || croppedPathname)
const articleToSave = {
let articleToSave: Page = {
id: '',
userId: uid,
originalHtml: domContent,
content: parsedContent?.content || '',
description: parsedContent?.excerpt,
@@ -258,10 +268,10 @@ export const createArticleResolver = authorized<
image: parsedContent?.previewImage,
publishedAt: validatedDate(parsedContent?.publishedDate),
uploadFileId: uploadFileId,
}
if (canonicalUrl && domContent) {
await ctx.pubsub.pageSaved(uid, canonicalUrl, domContent)
slug,
createdAt: saveTime,
savedAt: saveTime,
siteName: parsedContent?.siteName,
}
let archive = false
@@ -275,114 +285,6 @@ export const createArticleResolver = authorized<
}
}
const existingArticle = await models.article.getByUrlAndHash({
url: articleToSave.url,
hash: articleToSave.hash,
})
if (existingArticle) {
let uploadFileUrlOverride = ''
if (existingArticle.uploadFileId) {
/* Grab the public URL from the existing uploaded Article, we ignore
* the duplicate uploaded file record, leave it in initialized/unused
* state (the duplicate uploaded file in GCS will remain private).
*/
const uploadFileData = await models.uploadFile.get(
existingArticle.uploadFileId
)
uploadFileUrlOverride = await makeStorageFilePublic(
uploadFileData.id,
uploadFileData.fileName
)
}
// Checking if either a user article pointing to existing article or with the same URL exists
const matchedUserArticleRecord =
(await models.userArticle.getByParameters(uid, {
articleId: existingArticle.id,
})) ||
(await models.userArticle.getByParameters(uid, {
articleUrl: articleToSave.url,
}))
if (matchedUserArticleRecord) {
log.info('Article exists, updating user article', {
existingArticleId: existingArticle.id,
matchedUserArticleRecord,
uploadFileId: existingArticle.uploadFileId,
uploadFileUrlOverride,
labels: {
source: 'resolver',
resolver: 'createArticleResolver',
articleId: existingArticle.id,
userId: uid,
},
})
// If the user already has the article saved, we just update savedAt
// and we unarchive the article by updating archivedAt, unless they have
// a reminder set to archive the article.
const updatedArticle = await authTrx((tx) =>
models.userArticle.update(
matchedUserArticleRecord.id,
{
slug: slug,
archivedAt: archive ? saveTime : null,
savedAt: saveTime,
articleId: existingArticle.id,
articleUrl: uploadFileUrlOverride || existingArticle.url,
articleHash: existingArticle.hash,
},
tx
)
)
const createdArticle = { ...existingArticle, ...updatedArticle }
return articleSavingRequestPopulate(
{
user,
created: false,
createdArticle: createdArticle,
},
ctx,
articleSavingRequest?.id,
createdArticle.articleId || undefined
)
} else {
log.info('Article exists, creating user article', {
existingArticleId: existingArticle.id,
uploadFileId: existingArticle.uploadFileId,
uploadFileUrlOverride,
labels: {
source: 'resolver',
resolver: 'createArticleResolver',
articleId: existingArticle.id,
userId: uid,
},
})
const updatedArticle = await models.userArticle.create({
slug: slug,
userId: uid,
articleId: existingArticle.id,
articleUrl: uploadFileUrlOverride || existingArticle.url,
articleHash: existingArticle.hash,
})
const createdArticle = { ...existingArticle, ...updatedArticle }
return articleSavingRequestPopulate(
{
user,
created: false,
createdArticle: createdArticle,
},
ctx,
articleSavingRequest?.id,
createdArticle.articleId || undefined
)
}
}
log.info('New article saving', {
parsedArticle: Object.assign({}, articleToSave, {
content: undefined,
@@ -416,48 +318,43 @@ export const createArticleResolver = authorized<
)
}
const [articleRecord, matchedUserArticleRecord] = await Promise.all([
models.article.create(articleToSave),
models.userArticle.getByParameters(uid, {
articleUrl: articleToSave.url,
}),
])
const existingPage = await getPageByParam({
userId: uid,
url: articleToSave.url,
})
if (existingPage) {
// update existing page in elastic
existingPage.slug = slug
existingPage.savedAt = saveTime
existingPage.archivedAt = archive ? saveTime : undefined
existingPage.url = uploadFileUrlOverride || articleToSave.url
existingPage.hash = articleToSave.hash
if (!matchedUserArticleRecord && canonicalUrl && domContent) {
await ctx.pubsub.pageCreated(uid, canonicalUrl, domContent)
await updatePage(existingPage.id, existingPage, ctx)
log.info('page updated in elastic', existingPage.id)
articleToSave = existingPage
} else {
// create new page in elastic
const pageId = await createPage(articleToSave, ctx)
if (!pageId) {
return articleSavingRequestError(
{
errorCodes: [CreateArticleErrorCode.ElasticError],
},
ctx,
articleSavingRequest
)
}
log.info('page created in elastic', articleToSave)
articleToSave.id = pageId
}
// Updating already existing user_article record with the new saved_at date and the new article record
// or creating the new one.
const userArticle = await authTrx<Promise<UserArticleData>>((tx) =>
matchedUserArticleRecord
? models.userArticle.update(
matchedUserArticleRecord.id,
{
slug: slug,
savedAt: saveTime,
articleId: articleRecord.id,
archivedAt: archive ? saveTime : null,
articleUrl: uploadFileUrlOverride || articleRecord.url,
articleHash: articleRecord.hash,
},
tx
)
: models.userArticle.create(
{
userId: uid,
slug: slug,
savedAt: saveTime,
articleId: articleRecord.id,
archivedAt: archive ? saveTime : null,
articleUrl: uploadFileUrlOverride || articleRecord.url,
articleHash: articleRecord.hash,
},
tx
)
)
const createdArticle = { ...userArticle, ...articleRecord }
const createdArticle: PartialArticle = {
...articleToSave,
isArchived: !!articleToSave.archivedAt,
}
return articleSavingRequestPopulate(
{
user,
@@ -466,7 +363,7 @@ export const createArticleResolver = authorized<
},
ctx,
articleSavingRequest?.id,
createdArticle.articleId || undefined
createdArticle.id || undefined
)
} catch (error) {
if (
@@ -493,7 +390,7 @@ export const getArticleResolver: ResolverFn<
Record<string, unknown>,
WithDataSourcesContext,
QueryArticleArgs
> = async (_obj, { slug }, { claims, models }) => {
> = async (_obj, { slug }, { claims }) => {
try {
if (!claims?.uid) {
return { errorCodes: [ArticleErrorCode.Unauthorized] }
@@ -509,16 +406,19 @@
})
await createIntercomEvent('get-article', claims.uid)
const article = await models.userArticle.getByUserIdAndSlug(
claims.uid,
slug
)
console.log('start to get article', Date.now())
if (!article) {
const page = await getPageByParam({ userId: claims.uid, slug })
console.log('get article from elastic', Date.now())
if (!page) {
return { errorCodes: [ArticleErrorCode.NotFound] }
}
return { article: article }
return {
article: { ...page, isArchived: !!page.archivedAt, linkId: page.id },
}
} catch (error) {
return { errorCodes: [ArticleErrorCode.BadData] }
}
@@ -533,11 +433,13 @@
PaginatedPartialArticles,
ArticlesError,
QueryArticlesArgs
>(async (_obj, params, { models, claims, authTrx }) => {
>(async (_obj, params, { claims }) => {
const notNullField = params.sharedOnly ? 'sharedAt' : null
const startCursor = params.after || ''
const first = params.first || 10
console.log('getArticlesResolver starts', Date.now())
// Perform basic sanitization. Right now we just allow alphanumeric, space and quote
// so queries can contain phrases like "human race";
// We can also split out terms like "label:unread".
@@ -552,33 +454,36 @@
readFilter: searchQuery.readFilter,
typeFilter: searchQuery.typeFilter,
labelFilters: searchQuery.labelFilters,
sortParams: searchQuery.sortParams,
env: env.server.apiEnv,
},
})
console.log('parsed search query', Date.now())
await createIntercomEvent('search', claims.uid)
const [userArticles, totalCount] = (await authTrx((tx) =>
models.userArticle.getPaginated(
{
cursor: startCursor,
first: first + 1, // fetch one more item to get next cursor
sort: params.sort || undefined,
query: searchQuery.query,
inFilter: searchQuery.inFilter,
readFilter: searchQuery.readFilter,
typeFilter: searchQuery.typeFilter,
labelFilters: searchQuery.labelFilters,
},
claims.uid,
tx,
notNullField
)
const [pages, totalCount] = (await searchPages(
{
from: Number(startCursor),
size: first + 1, // fetch one more item to get next cursor
sort: searchQuery.sortParams || params.sort || undefined,
query: searchQuery.query,
inFilter: searchQuery.inFilter,
readFilter: searchQuery.readFilter,
typeFilter: searchQuery.typeFilter,
labelFilters: searchQuery.labelFilters,
},
claims.uid,
notNullField
)) || [[], 0]
const start =
startCursor && !isNaN(Number(startCursor)) ? Number(startCursor) : 0
const hasNextPage = userArticles.length > first
const endCursor = String(start + userArticles.length - (hasNextPage ? 1 : 0))
const hasNextPage = pages.length > first
const endCursor = String(start + pages.length - (hasNextPage ? 1 : 0))
console.log('get search result', Date.now())
console.log(
'start',
@@ -586,18 +491,22 @@ export const getArticlesResolver = authorized<
'returning end cursor',
endCursor,
'length',
userArticles.length - 1
pages.length - 1
)
//TODO: refactor so that the lastCursor included
if (hasNextPage) {
// remove an extra if exists
userArticles.pop()
pages.pop()
}
const edges = userArticles.map((a) => {
const edges = pages.map((a) => {
return {
node: { ...a, image: a.image && createImageProxyUrl(a.image, 88, 88) },
node: {
...a,
image: a.image && createImageProxyUrl(a.image, 88, 88),
isArchived: !!a.archivedAt,
},
cursor: endCursor,
}
})
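The from/size cursor handling in this resolver follows a common pattern: request `first + 1` rows and use the extra row only to detect whether a next page exists. A sketch of the same arithmetic in isolation (helper name is hypothetical):

```typescript
// Sketch of the numeric cursor math used above: `fetched` is the number
// of rows Elasticsearch returned for a request of size first + 1.
const pageInfo = (start: number, first: number, fetched: number) => {
  const hasNextPage = fetched > first
  // The probe row is popped before returning, so the cursor excludes it.
  const endCursor = String(start + fetched - (hasNextPage ? 1 : 0))
  return { hasNextPage, endCursor }
}
```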
@@ -700,38 +609,29 @@ export const setBookmarkArticleResolver = authorized<
async (
_,
{ input: { articleID, bookmark } },
{ models, authTrx, claims: { uid }, log }
{ models, authTrx, claims: { uid }, log, pubsub }
) => {
const article = await models.article.get(articleID)
const article = await getPageById(articleID)
if (!article) {
return { errorCodes: [SetBookmarkArticleErrorCode.NotFound] }
}
if (!bookmark) {
const { userArticleRemoved, highlightsUnshared } = await authTrx(
async (tx) => {
const userArticleRemoved = await models.userArticle.getForUser(
uid,
article.id,
tx
)
await models.userArticle.deleteWhere(
{ userId: uid, articleId: article.id },
tx
)
const highlightsUnshared =
await models.highlight.unshareAllHighlights(articleID, uid, tx)
return { userArticleRemoved, highlightsUnshared }
}
)
const userArticleRemoved = await getPageByParam({
userId: uid,
_id: articleID,
})
if (!userArticleRemoved) {
return { errorCodes: [SetBookmarkArticleErrorCode.NotFound] }
}
await deletePage(userArticleRemoved.id, { pubsub })
const highlightsUnshared = await authTrx(async (tx) => {
return models.highlight.unshareAllHighlights(articleID, uid, tx)
})
log.info('Article unbookmarked', {
article: Object.assign({}, article, {
content: undefined,
@@ -749,25 +649,19 @@
return {
bookmarkedArticle: {
...userArticleRemoved,
...article,
isArchived: false,
savedByViewer: false,
postedByViewer: false,
},
}
} else {
try {
const userArticle = await authTrx((tx) => {
return models.userArticle.create(
{
userId: uid,
articleId: article.id,
slug: generateSlug(article.title),
articleUrl: article.url,
articleHash: article.hash,
},
tx
)
})
const userArticle: Partial<Page> = {
userId: uid,
slug: generateSlug(article.title),
}
await updatePage(articleID, userArticle, { pubsub })
log.info('Article bookmarked', {
article: Object.assign({}, article, {
content: undefined,
@@ -785,6 +679,7 @@ export const setBookmarkArticleResolver = authorized<
bookmarkedArticle: {
...userArticle,
...article,
isArchived: false,
savedByViewer: true,
postedByViewer: false,
},
@@ -808,11 +703,9 @@ export const saveArticleReadingProgressResolver = authorized<
async (
_,
{ input: { id, readingProgressPercent, readingProgressAnchorIndex } },
{ models, authTrx, claims: { uid } }
{ claims: { uid }, pubsub }
) => {
const userArticleRecord = await authTrx((tx) =>
models.userArticle.getByParameters(uid, { articleId: id }, tx)
)
const userArticleRecord = await getPageByParam({ userId: uid, _id: id })
if (!userArticleRecord) {
return { errorCodes: [SaveArticleReadingProgressErrorCode.NotFound] }
@@ -830,32 +723,28 @@
// be greater than the current reading progress.
const shouldUpdate =
readingProgressPercent === 0 ||
userArticleRecord.articleReadingProgress < readingProgressPercent ||
userArticleRecord.articleReadingProgressAnchorIndex <
(userArticleRecord.readingProgressPercent || 0) <
readingProgressPercent ||
(userArticleRecord.readingProgressAnchorIndex || 0) <
readingProgressAnchorIndex
const updatedArticle = Object.assign(await models.article.get(id), {
const updatedArticle = Object.assign(userArticleRecord, {
readingProgressPercent: shouldUpdate
? readingProgressPercent
: userArticleRecord.articleReadingProgress,
: userArticleRecord.readingProgressPercent,
readingProgressAnchorIndex: shouldUpdate
? readingProgressAnchorIndex
: userArticleRecord.articleReadingProgressAnchorIndex,
: userArticleRecord.readingProgressAnchorIndex,
})
shouldUpdate &&
(await authTrx((tx) =>
models.userArticle.update(
userArticleRecord.id,
{
articleReadingProgress: readingProgressPercent,
articleReadingProgressAnchorIndex: readingProgressAnchorIndex,
},
tx
)
))
shouldUpdate && (await updatePage(id, updatedArticle, { pubsub }))
return { updatedArticle: { ...userArticleRecord, ...updatedArticle } }
return {
updatedArticle: {
...updatedArticle,
isArchived: !!updatedArticle.archivedAt,
},
}
}
)
@@ -864,7 +753,7 @@ export const getReadingProgressForArticleResolver: ResolverFn<
Article,
WithDataSourcesContext,
Record<string, unknown>
> = async (article, _params, { models, claims }) => {
> = async (article, _params, { claims }) => {
if (!claims?.uid) {
return 0
}
@@ -877,8 +766,8 @@
}
const articleReadingProgress = (
await models.userArticle.getByArticleId(claims.uid, article.id)
)?.articleReadingProgress
await getPageByParam({ userId: claims.uid, _id: article.id })
)?.readingProgressPercent
return articleReadingProgress || 0
}
@@ -888,7 +777,7 @@ export const getReadingProgressAnchorIndexForArticleResolver: ResolverFn<
Article,
WithDataSourcesContext,
Record<string, unknown>
> = async (article, _params, { models, claims }) => {
> = async (article, _params, { claims }) => {
if (!claims?.uid) {
return 0
}
@@ -901,8 +790,8 @@
}
const articleReadingProgressAnchorIndex = (
await models.userArticle.getByArticleId(claims.uid, article.id)
)?.articleReadingProgressAnchorIndex
await getPageByParam({ userId: claims.uid, _id: article.id })
)?.readingProgressAnchorIndex
return articleReadingProgressAnchorIndex || 0
}
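The guard in `saveArticleReadingProgressResolver` enforces that progress only moves forward, with an explicit reset to 0 as the one exception. The rule can be sketched as a pure function (the helper name is hypothetical):

```typescript
// Sketch of the monotonic reading-progress guard: an update is applied
// only if it resets to 0 or strictly advances either measure.
const shouldUpdateProgress = (
  currentPercent: number | undefined,
  currentAnchor: number | undefined,
  percent: number,
  anchor: number
): boolean =>
  percent === 0 ||
  (currentPercent || 0) < percent ||
  (currentAnchor || 0) < anchor
```

This is what makes the "don't decrease reading progress" test pass: a stale lower percentage is rejected unless it is exactly 0.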

View File

@@ -37,8 +37,6 @@ import {
getFollowersResolver,
getFollowingResolver,
getMeUserResolver,
getReadingProgressAnchorIndexForArticleResolver,
getReadingProgressForArticleResolver,
getSharedArticleResolver,
getUserFeedArticlesResolver,
getUserPersonalizationResolver,
@@ -79,8 +77,7 @@ import {
generateDownloadSignedUrl,
generateUploadFilePathName,
} from '../utils/uploads'
import { Label } from '../entity/label'
import { labelsLoader } from '../services/labels'
import { getPageById, getPageByParam } from '../elastic'
/* eslint-disable @typescript-eslint/naming-convention */
type ResultResolveType = {
@@ -294,11 +291,13 @@ export const functionResolvers = {
},
Article: {
async url(article: Article, _: unknown, ctx: WithDataSourcesContext) {
if (article.pageType == PageType.File && ctx.claims) {
const upload = await ctx.models.uploadFile.uploadFileForArticle(
article.id
)
if (!upload || !upload.fileName || !upload.fileName) {
if (
article.pageType == PageType.File &&
ctx.claims &&
article.uploadFileId
) {
const upload = await ctx.models.uploadFile.get(article.uploadFileId)
if (!upload || !upload.fileName) {
return undefined
}
const filePath = generateUploadFilePathName(upload.id, upload.fileName)
@@ -316,11 +315,11 @@ export const functionResolvers = {
return article.savedByViewer
}
if (!ctx.claims?.uid) return undefined
const userArticle = await ctx.models.userArticle.getByArticleId(
ctx.claims.uid,
article.id
)
return !!userArticle
const page = await getPageByParam({
userId: ctx.claims.uid,
_id: article.id,
})
return !!page
},
async postedByViewer(
article: { id: string; postedByViewer?: boolean },
@@ -331,11 +330,11 @@
return article.postedByViewer
}
if (!ctx.claims?.uid) return false
const userArticle = await ctx.models.userArticle.getByArticleId(
ctx.claims.uid,
article.id
)
return !!userArticle?.sharedAt
const page = await getPageByParam({
userId: ctx.claims.uid,
_id: article.id,
})
return !!page?.sharedAt
},
async savedAt(
article: { id: string; savedAt?: Date; createdAt?: Date },
@@ -343,13 +342,13 @@
ctx: WithDataSourcesContext & { claims: Claims }
) {
if (!ctx.claims?.uid) return new Date()
if (!article.savedAt) return new Date()
if (article.savedAt) return article.savedAt
return (
(
await ctx.models.userArticle.getByArticleId(
ctx.claims.uid,
article.id
)
await getPageByParam({
userId: ctx.claims.uid,
_id: article.id,
})
)?.savedAt ||
article.createdAt ||
new Date()
@@ -365,18 +364,22 @@
return validatedDate(article.publishedAt)
},
async isArchived(
article: { id: string; isArchived?: boolean | null },
article: {
id: string
isArchived?: boolean | null
archivedAt?: Date | undefined
},
__: unknown,
ctx: WithDataSourcesContext & { claims: Claims }
) {
if ('isArchived' in article) return article.isArchived
if ('archivedAt' in article) return !!article.archivedAt
if (!ctx.claims?.uid) return false
const userArticle = await ctx.models.userArticle.getForUser(
ctx.claims.uid,
article.id,
ctx.kx
)
return userArticle?.isArchived || false
const page = await getPageByParam({
userId: ctx.claims.uid,
_id: article.id,
})
return !!page?.archivedAt || false
},
contentReader(article: { pageType: PageType }) {
return article.pageType === PageType.File
@@ -384,28 +387,34 @@
: ContentReader.Web
},
async readingProgressPercent(
article: { id: string; articleReadingProgress?: number },
article: { id: string; readingProgressPercent?: number },
_: unknown,
ctx: WithDataSourcesContext & { claims: Claims }
) {
if ('articleReadingProgress' in article) {
return article.articleReadingProgress
if ('readingProgressPercent' in article) {
return article.readingProgressPercent
}
return (
await ctx.models.userArticle.getByArticleId(ctx.claims.uid, article.id)
)?.articleReadingProgress
await getPageByParam({
userId: ctx.claims.uid,
_id: article.id,
})
)?.readingProgressPercent
},
async readingProgressAnchorIndex(
article: { id: string; articleReadingProgressAnchorIndex?: number },
article: { id: string; readingProgressAnchorIndex?: number },
_: unknown,
ctx: WithDataSourcesContext & { claims: Claims }
) {
if ('articleReadingProgressAnchorIndex' in article) {
return article.articleReadingProgressAnchorIndex
if ('readingProgressAnchorIndex' in article) {
return article.readingProgressAnchorIndex
}
return (
await ctx.models.userArticle.getByArticleId(ctx.claims.uid, article.id)
)?.articleReadingProgressAnchorIndex
await getPageByParam({
userId: ctx.claims.uid,
_id: article.id,
})
)?.readingProgressAnchorIndex
},
async highlights(
article: { id: string; userId?: string },
@@ -449,31 +458,20 @@
ctx.models
)
},
async labels(article: { linkId: string }): Promise<Label[]> {
// retrieve labels for the link
return labelsLoader.load(article.linkId)
},
},
ArticleSavingRequest: {
async article(
request: { userId: string; articleId: string },
__: unknown,
ctx: WithDataSourcesContext
) {
const article = await ctx.models.userArticle.getForUser(
request.userId,
request.articleId
)
return article
async article(request: { userId: string; articleId: string }, __: unknown) {
if (!request.userId || !request.articleId) return undefined
return getPageByParam({
userId: request.userId,
_id: request.articleId,
})
},
},
Highlight: {
async article(
highlight: { articleId: string },
__: unknown,
ctx: WithDataSourcesContext
) {
return ctx.models.article.get(highlight.articleId)
async article(highlight: { articleId: string }, __: unknown) {
return getPageById(highlight.articleId)
},
async user(
highlight: { userId: string },
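Several of the field resolvers above share one pattern: return the value if the parent object already carries it, otherwise fall back to an Elastic lookup via `getPageByParam`. A generic sketch of that pattern (the helper name is hypothetical, not part of the PR):

```typescript
// Hypothetical helper: prefer a field already present on the parent,
// otherwise run the provided fallback lookup.
const resolveWithFallback = async <T>(
  parent: Record<string, unknown>,
  key: string,
  lookup: () => Promise<T | undefined>
): Promise<unknown> => (key in parent ? parent[key] : lookup())
```

Note that `key in parent` deliberately differs from a truthiness check, so an explicit `0` or `false` on the parent short-circuits the lookup.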

View File

@@ -7,16 +7,16 @@ import {
CreateHighlightError,
CreateHighlightErrorCode,
CreateHighlightSuccess,
MergeHighlightError,
MergeHighlightErrorCode,
MergeHighlightSuccess,
DeleteHighlightError,
DeleteHighlightErrorCode,
DeleteHighlightSuccess,
Highlight,
MergeHighlightError,
MergeHighlightErrorCode,
MergeHighlightSuccess,
MutationCreateHighlightArgs,
MutationMergeHighlightArgs,
MutationDeleteHighlightArgs,
MutationMergeHighlightArgs,
MutationSetShareHighlightArgs,
MutationUpdateHighlightArgs,
SetShareHighlightError,
@@ -30,6 +30,7 @@ import {
import { HighlightData } from '../../datalayer/highlight/model'
import { env } from '../../env'
import { analytics } from '../../utils/analytics'
import { getPageById } from '../../elastic'
const highlightDataToHighlight = (highlight: HighlightData): Highlight => ({
...highlight,
@@ -47,7 +48,13 @@
MutationCreateHighlightArgs
>(async (_, { input }, { models, claims, log }) => {
const { articleId } = input
const article = await models.article.get(articleId)
const article = await getPageById(articleId)
if (!article) {
return {
errorCodes: [CreateHighlightErrorCode.NotFound],
}
}
analytics.track({
userId: claims.uid,
@@ -58,12 +65,6 @@
},
})
if (!article.id) {
return {
errorCodes: [CreateHighlightErrorCode.NotFound],
}
}
if (input.annotation && input.annotation.length > 4000) {
return {
errorCodes: [CreateHighlightErrorCode.BadData],
@@ -73,7 +74,9 @@
try {
const highlight = await models.highlight.create({
...input,
articleId: undefined,
userId: claims.uid,
elasticPageId: article.id,
})
log.info('Creating a new highlight', {
@@ -86,7 +89,8 @@
})
return { highlight: highlightDataToHighlight(highlight) }
} catch {
} catch (err) {
log.error('Error creating highlight', err)
return {
errorCodes: [CreateHighlightErrorCode.AlreadyExists],
}
@@ -129,6 +133,7 @@ export const mergeHighlightResolver = authorized<
...newHighlightInput,
annotation: mergedAnnotation ? mergedAnnotation.join('\n') : null,
userId: claims.uid,
elasticPageId: newHighlightInput.articleId,
})
})
if (!highlight) {

View File

@@ -22,9 +22,8 @@ import { User } from '../../entity/user'
import { Label } from '../../entity/label'
import { getManager, getRepository, ILike } from 'typeorm'
import { setClaims } from '../../entity/utils'
import { Link } from '../../entity/link'
import { LinkLabel } from '../../entity/link_label'
import { labelsLoader } from '../../services/labels'
import { deleteLabelInPages, getPageById, updatePage } from '../../elastic'
import { createPubSubClient } from '../../datalayer/pubsub'
export const labelsResolver = authorized<LabelsSuccess, LabelsError>(
async (_obj, _params, { claims: { uid }, log }) => {
@@ -161,6 +160,9 @@ export const deleteLabelResolver = authorized<
}
}
// delete label in elastic pages
await deleteLabelInPages(uid, label.name, { pubsub: createPubSubClient() })
analytics.track({
userId: uid,
event: 'deleteLabel',
@@ -185,10 +187,10 @@
SetLabelsSuccess,
SetLabelsError,
MutationSetLabelsArgs
>(async (_, { input }, { claims: { uid }, log }) => {
>(async (_, { input }, { claims: { uid }, log, pubsub }) => {
log.info('setLabelsResolver')
const { linkId, labelIds } = input
const { linkId: pageId, labelIds } = input
try {
const user = await getRepository(User).findOne(uid)
@@ -198,8 +200,8 @@
}
}
const link = await getRepository(Link).findOne(linkId)
if (!link) {
const page = await getPageById(pageId)
if (!page) {
return {
errorCodes: [SetLabelsErrorCode.NotFound],
}
@@ -217,24 +219,20 @@
}
}
// delete all existing labels of the link
await getManager().transaction(async (t) => {
await t.getRepository(LinkLabel).delete({ link })
// add new labels
await t
.getRepository(LinkLabel)
.save(labels.map((label) => ({ link, label })))
})
// clear cache
labelsLoader.clear(linkId)
// update labels in the page
await updatePage(
pageId,
{
labels,
},
{ pubsub }
)
analytics.track({
userId: uid,
event: 'setLabels',
properties: {
linkId,
pageId,
labelIds,
env: env.server.apiEnv,
},

View File

@@ -1,20 +1,20 @@
import { createOrUpdateLinkShareInfo } from '../../datalayer/links/share_info'
import {
UpdateLinkShareInfoError,
UpdateLinkShareInfoSuccess,
UpdateLinkShareInfoErrorCode,
MutationUpdateLinkShareInfoArgs,
ArchiveLinkSuccess,
ArchiveLinkError,
MutationSetLinkArchivedArgs,
ArchiveLinkErrorCode,
ArchiveLinkSuccess,
MutationSetLinkArchivedArgs,
MutationUpdateLinkShareInfoArgs,
UpdateLinkShareInfoError,
UpdateLinkShareInfoErrorCode,
UpdateLinkShareInfoSuccess,
} from '../../generated/graphql'
import { setLinkArchived } from '../../services/archive_link'
import { authorized } from '../../utils/helpers'
import { analytics } from '../../utils/analytics'
import { env } from '../../env'
import { updatePage } from '../../elastic'
export const updateLinkShareInfoResolver = authorized<
UpdateLinkShareInfoSuccess,
@@ -64,7 +64,7 @@
ArchiveLinkSuccess,
ArchiveLinkError,
MutationSetLinkArchivedArgs
>(async (_obj, args, { models, claims, authTrx }) => {
>(async (_obj, args, { claims, pubsub }) => {
console.log('setLinkArchivedResolver', args.input.linkId)
analytics.track({
@@ -75,22 +75,14 @@
},
})
// TEMP: because the old API uses articles instead of Links, we are actually
// getting an article ID here and need to map it to a link ID. When the API
// is updated to use Links instead of Articles this will be removed.
const link = await authTrx((tx) =>
models.userArticle.getByArticleId(claims.uid, args.input.linkId, tx)
)
if (!link?.id) {
return {
message: 'An error occurred',
errorCodes: [ArchiveLinkErrorCode.BadRequest],
}
}
try {
await setLinkArchived(claims.uid, link.id, args.input.archived)
await updatePage(
args.input.linkId,
{
archivedAt: args.input.archived ? new Date() : null,
},
{ pubsub }
)
} catch (e) {
return {
message: 'An error occurred',
@@ -99,7 +91,7 @@
}
return {
linkId: link.id,
linkId: args.input.linkId,
message: 'Link Archived',
}
})

View File

@@ -1,10 +1,10 @@
import {
MutationSaveUrlArgs,
MutationSaveFileArgs,
MutationSavePageArgs,
MutationSaveUrlArgs,
SaveError,
SaveErrorCode,
SaveSuccess,
MutationSaveFileArgs,
} from '../../generated/graphql'
import { savePage } from '../../services/save_page'
import { saveUrl } from '../../services/save_url'
@@ -28,7 +28,7 @@
return { errorCodes: [SaveErrorCode.Unauthorized] }
}
return await savePage(
return savePage(
ctx,
{ userId: user.id, username: user.profile.username },
input

View File

@@ -2,9 +2,12 @@
/* eslint-disable @typescript-eslint/no-unsafe-assignment */
/* eslint-disable @typescript-eslint/explicit-module-boundary-types */
import express from 'express'
import { readPushSubscription } from '../../datalayer/pubsub'
import { kx } from '../../datalayer/knex_config'
import ArticleModel from '../../datalayer/article'
import {
createPubSubClient,
readPushSubscription,
} from '../../datalayer/pubsub'
import { getPageByParam, updatePage } from '../../elastic'
import { Page } from '../../elastic/types'
interface UpdateContentMessage {
fileId: string
@@ -50,21 +53,21 @@ export function contentServiceRouter() {
return
}
const model = new ArticleModel(kx)
const page = await model.getByUploadFileId(fileId)
const page = await getPageByParam({ uploadFileId: fileId })
if (!page) {
console.log('No upload file found for id:', fileId)
res.status(400).send('Bad Request')
return
}
const result = await model.updateContent(
page.id,
msg.content,
msg.title,
msg.author,
msg.description
)
const pageToUpdate: Partial<Page> = { content: msg.content }
if (msg.title) pageToUpdate.title = msg.title
if (msg.author) pageToUpdate.author = msg.author
if (msg.description) pageToUpdate.description = msg.description
const result = await updatePage(page.id, pageToUpdate, {
pubsub: createPubSubClient(),
})
console.log(
'Updating article text',
page.id,
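The conditional field copying in this route (only set `title`, `author`, `description` when the message actually supplies them) can be generalized; a sketch under the same message shape:

```typescript
// Sketch: build a partial page update from an update message, copying
// only the optional fields that were actually supplied.
type UpdateMsg = {
  content: string
  title?: string
  author?: string
  description?: string
}

const toPartialUpdate = (msg: UpdateMsg): Record<string, string> => {
  const update: Record<string, string> = { content: msg.content }
  for (const key of ['title', 'author', 'description'] as const) {
    const value = msg[key]
    if (value) update[key] = value
  }
  return update
}
```

Skipping absent fields matters here because passing `undefined` values into the Elastic update could otherwise clobber existing document fields.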

View File

@@ -0,0 +1,46 @@
/* eslint-disable @typescript-eslint/no-misused-promises */
/* eslint-disable @typescript-eslint/no-unsafe-assignment */
/* eslint-disable @typescript-eslint/explicit-module-boundary-types */
import express from 'express'
import { readPushSubscription } from '../../datalayer/pubsub'
import { generateUploadSignedUrl, uploadToSignedUrl } from '../../utils/uploads'
import { v4 as uuidv4 } from 'uuid'
export function pageServiceRouter() {
const router = express.Router()
router.post('/upload/:folder', async (req, res) => {
console.log('upload page data req', req.params.folder)
const { message: msgStr, expired } = readPushSubscription(req)
if (!msgStr) {
res.status(400).send('Bad Request')
return
}
if (expired) {
console.log('discarding expired message')
res.status(200).send('Expired')
return
}
try {
const contentType = 'application/json'
const uploadUrl = await generateUploadSignedUrl(
`${req.params.folder}/${new Date().toDateString()}/${uuidv4()}.json`,
contentType
)
await uploadToSignedUrl(
uploadUrl,
Buffer.from(msgStr, 'utf8'),
contentType
)
res.status(200).send('OK')
} catch (err) {
console.log('upload page data failed', err)
res.status(500).send(err)
}
})
return router
}
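The upload route above names objects by folder and calendar day. A minimal sketch of that naming scheme (the helper name is illustrative; note that `Date.toDateString()` yields path segments containing spaces):

```typescript
// Builds the GCS object name used above: <folder>/<human-readable date>/<id>.json
const pageObjectPath = (
  folder: string,
  id: string,
  date: Date = new Date()
): string => `${folder}/${date.toDateString()}/${id}.json`
```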

View File

@@ -14,6 +14,9 @@ import { analytics } from '../../utils/analytics'
import { getNewsletterEmail } from '../../services/newsletters'
import { setClaims } from '../../datalayer/helpers'
import { generateSlug } from '../../utils/helpers'
import { createPage } from '../../elastic'
import { createPubSubClient } from '../../datalayer/pubsub'
import { Page } from '../../elastic/types'
export function pdfAttachmentsRouter() {
const router = express.Router()
@@ -141,30 +144,24 @@ export function pdfAttachmentsRouter() {
const uploadFileHash = uploadFileDetails.md5Hash
const pageType = PageType.File
const title = subject || uploadFileData.fileName
const articleToSave = {
const articleToSave: Page = {
url: uploadFileUrlOverride,
pageType: pageType,
hash: uploadFileHash,
uploadFileId: uploadFileId,
title: title,
content: '',
userId: user.id,
slug: generateSlug(title),
id: '',
createdAt: new Date(),
}
const link = await kx.transaction(async (tx) => {
await setClaims(tx, user.id)
const articleRecord = await models.article.create(articleToSave, tx)
return models.userArticle.create(
{
userId: user.id,
slug: generateSlug(title),
articleId: articleRecord.id,
articleUrl: uploadFileUrlOverride,
articleHash: articleRecord.hash,
},
tx
)
const pageId = await createPage(articleToSave, {
pubsub: createPubSubClient(),
})
res.send({ id: link.id })
res.send({ id: pageId })
} catch (err) {
console.log(err)
res.status(500).send(err)

View File

@@ -26,6 +26,7 @@ const schema = gql`
enum SortBy {
UPDATED_TIME
SCORE
}
enum ContentReader {
@@ -335,6 +336,7 @@ const schema = gql`
isArchived: Boolean!
linkId: ID
labels: [Label!]
uploadFileId: ID
}
# Query: article
@@ -436,6 +438,7 @@ const schema = gql`
NOT_ALLOWED_TO_PARSE
PAYLOAD_TOO_LARGE
UPLOAD_FILE_MISSING
ELASTIC_ERROR
}
type CreateArticleError {
errorCodes: [CreateArticleErrorCode!]!
@@ -1246,7 +1249,7 @@ const schema = gql`
name: String!
color: String!
description: String
createdAt: Date!
createdAt: Date
}
type LabelsSuccess {

View File

@@ -39,6 +39,8 @@ import { remindersServiceRouter } from './routers/svc/reminders'
import { ApolloServer } from 'apollo-server-express'
import { pdfAttachmentsRouter } from './routers/svc/pdf_attachments'
import { corsConfig } from './utils/corsConfig'
import { initElasticsearch } from './elastic'
import { pageServiceRouter } from './routers/svc/pages'
const PORT = process.env.PORT || 4000
@@ -98,6 +100,7 @@ export const createApp = (): {
app.use('/svc/pubsub/links', linkServiceRouter())
app.use('/svc/pubsub/newsletters', newsletterServiceRouter())
app.use('/svc/pubsub/emails', emailsServiceRouter())
app.use('/svc/pubsub/pages', pageServiceRouter())
app.use('/svc/reminders', remindersServiceRouter())
app.use('/svc/pdf-attachments', pdfAttachmentsRouter())
@@ -124,6 +127,8 @@ const main = async (): Promise<void> => {
// as healthy.
await initEntities()
await initElasticsearch()
const { app, apollo, httpServer } = createApp()
await apollo.start()

View File

@@ -2,9 +2,9 @@ import { PubsubClient } from '../datalayer/pubsub'
import { DataModels } from '../resolvers/types'
import { generateSlug, stringToHash, validatedDate } from '../utils/helpers'
import {
parseUrlMetadata,
parseOriginalContent,
parsePreparedContent,
parseUrlMetadata,
} from '../utils/parser'
import normalizeUrl from 'normalize-url'
import { kx } from '../datalayer/knex_config'
@@ -65,11 +65,11 @@ export const saveEmail = async (
}
if (parseResult.canonicalUrl && parseResult.domContent) {
await ctx.pubsub.pageSaved(
saverId,
parseResult.canonicalUrl,
parseResult.domContent
)
// await ctx.pubsub.pageSaved(
// saverId,
// parseResult.canonicalUrl,
// parseResult.domContent
// )
}
const matchedUserArticleRecord = await ctx.models.userArticle.getByParameters(
@@ -81,7 +81,7 @@ export const saveEmail = async (
let result: UserArticleData | undefined = undefined
if (matchedUserArticleRecord) {
await ctx.pubsub.pageCreated(saverId, url, input.originalContent)
// await ctx.pubsub.pageCreated(saverId, url, input.originalContent)
await kx.transaction(async (tx) => {
await setClaims(tx, saverId)
@@ -95,7 +95,7 @@ export const saveEmail = async (
})
console.log('created matched email', result)
} else {
await ctx.pubsub.pageCreated(saverId, url, input.originalContent)
// await ctx.pubsub.pageCreated(saverId, url, input.originalContent)
await kx.transaction(async (tx) => {
await setClaims(tx, saverId)

View File

@@ -13,6 +13,7 @@ import { DataModels } from '../resolvers/types'
import { generateSlug } from '../utils/helpers'
import { getStorageFileDetails, makeStorageFilePublic } from '../utils/uploads'
import { createSavingRequest } from './save_page'
import { createPage, getPageByParam, updatePage } from '../elastic'
type SaveContext = {
pubsub: PubsubClient
@@ -70,68 +71,60 @@ export const saveFile = async (
uploadFileData.fileName
)
// if (parseResult.canonicalUrl && parseResult.domContent) {
// ctx.pubsub.pageSaved(saver.id, parseResult.canonicalUrl, parseResult.domContent)
// }
const matchedUserArticleRecord = await ctx.authTrx(async (tx) => {
return ctx.models.userArticle.getByParameters(
saver.id,
{
articleUrl: uploadFileUrlOverride,
},
tx
)
const matchedUserArticleRecord = await getPageByParam({
userId: saver.id,
url: uploadFileUrlOverride,
})
if (matchedUserArticleRecord) {
// ctx.pubsub.pageCreated(saver.id, input.url, input.originalContent)
await updatePage(
matchedUserArticleRecord.id,
{
savedAt: new Date(),
archivedAt: null,
},
ctx
)
await ctx.authTrx(async (tx) => {
await ctx.models.userArticle.update(
matchedUserArticleRecord.id,
{
savedAt: new Date(),
archivedAt: null,
},
tx
)
await ctx.models.articleSavingRequest.update(
savingRequest.id,
{
articleId: matchedUserArticleRecord.articleId,
elasticPageId: matchedUserArticleRecord.id,
status: ArticleSavingRequestStatus.Succeeded,
},
tx
)
})
} else {
const pageId = await createPage(
{
url: uploadFileUrlOverride,
title: uploadFile.fileName,
hash: uploadFileDetails.md5Hash,
content: '',
pageType: PageType.File,
uploadFileId: input.uploadFileId,
slug: generateSlug(uploadFile.fileName),
userId: saver.id,
id: '',
createdAt: new Date(),
},
ctx
)
if (!pageId) {
console.log('error creating page in elastic', input)
return {
errorCodes: [SaveErrorCode.Unknown],
}
}
await ctx.authTrx(async (tx) => {
const article = await ctx.models.article.create(
{
url: uploadFileUrlOverride,
title: uploadFile.fileName,
hash: uploadFileDetails.md5Hash,
content: '',
pageType: PageType.File,
uploadFileId: input.uploadFileId,
},
tx
)
await ctx.models.userArticle.create(
{
slug: generateSlug(uploadFile.fileName),
userId: saver.id,
articleId: article.id,
articleUrl: article.url,
articleHash: article.hash,
},
tx
)
await ctx.models.articleSavingRequest.update(
savingRequest.id,
{
articleId: article.id,
elasticPageId: pageId,
status: ArticleSavingRequestStatus.Succeeded,
},
tx

View File

@@ -1,7 +1,5 @@ import { MulticastMessage } from 'firebase-admin/messaging'
import { MulticastMessage } from 'firebase-admin/messaging'
import { ArticleData } from '../datalayer/article/model'
import { kx } from '../datalayer/knex_config'
import { UserArticleData } from '../datalayer/links/model'
import { createPubSubClient } from '../datalayer/pubsub'
import { UserDeviceToken } from '../entity/user_device_tokens'
import { env } from '../env'
@@ -12,6 +10,8 @@ import { sendMulticastPushNotifications } from '../utils/sendNotification'
import { getNewsletterEmail } from './newsletters'
import { SaveContext, saveEmail, SaveEmailInput } from './save_email'
import { getDeviceTokensByUserId } from './user_device_tokens'
import { getPageByParam } from '../elastic'
import { Page } from '../elastic/types'
interface NewsletterMessage {
email: string
@@ -70,10 +70,10 @@ export const saveNewsletterEmail = async (
return true
}
const link = await ctx.models.userArticle.getForUser(
newsletterEmail.user.id,
result.articleId
)
const link = await getPageByParam({
_id: result.articleId,
userId: newsletterEmail.user.id,
})
if (!link) {
console.log(
@@ -97,7 +97,7 @@ export const saveNewsletterEmail = async (
}
const messageForLink = (
link: ArticleData & UserArticleData,
link: Page,
deviceTokens: UserDeviceToken[]
): MulticastMessage => {
let title = '📫 - An article was added to your Omnivore Inbox'
@@ -117,10 +117,10 @@ const messageForLink = (
title: link.title,
image: link.image,
author: link.author,
isArchived: link.isArchived,
isArchived: !!link.archivedAt,
contentReader: ContentReader.Web,
readingProgressPercent: link.articleReadingProgress,
readingProgressAnchorIndex: link.articleReadingProgressAnchorIndex,
readingProgressPercent: link.readingProgressPercent,
readingProgressAnchorIndex: link.readingProgressAnchorIndex,
})
).toString('base64'),
}
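The diff above replaces the stored isArchived flag with a value derived from the elastic page's archivedAt timestamp. That derivation can be captured in a one-line helper (a sketch; PageLike is an illustrative type):

```typescript
interface PageLike {
  archivedAt?: Date | null
}

// A page counts as archived exactly when an archive timestamp is set.
const isArchived = (page: PageLike): boolean => !!page.archivedAt
```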

View File

@@ -8,12 +8,14 @@ import {
} from '../generated/graphql'
import { DataModels } from '../resolvers/types'
import { generateSlug, stringToHash, validatedDate } from '../utils/helpers'
import { parsePreparedContent, parseOriginalContent } from '../utils/parser'
import { parseOriginalContent, parsePreparedContent } from '../utils/parser'
import normalizeUrl from 'normalize-url'
import { createPageSaveRequest } from './create_page_save_request'
import { kx } from '../datalayer/knex_config'
import { setClaims } from '../datalayer/helpers'
import { createPage, getPageByParam, updatePage } from '../elastic'
import { Page } from '../elastic/types'
type SaveContext = {
pubsub: PubsubClient
@@ -85,7 +87,10 @@ export const savePage = async (
const pageType = parseOriginalContent(input.url, input.originalContent)
const articleToSave = {
const articleToSave: Page = {
id: '',
slug,
userId: saver.userId,
originalHtml: parseResult.domContent,
content: parseResult.parsedContent?.content || '',
description: parseResult.parsedContent?.excerpt,
@@ -99,41 +104,29 @@ export const savePage = async (
hash: stringToHash(parseResult.parsedContent?.content || input.url),
image: parseResult.parsedContent?.previewImage,
publishedAt: validatedDate(parseResult.parsedContent?.publishedDate),
createdAt: new Date(),
}
if (parseResult.canonicalUrl && parseResult.domContent) {
await ctx.pubsub.pageSaved(
saver.userId,
parseResult.canonicalUrl,
parseResult.domContent
const existingPage = await getPageByParam({
userId: saver.userId,
url: articleToSave.url,
})
if (existingPage) {
await updatePage(
existingPage.id,
{
savedAt: new Date(),
archivedAt: undefined,
},
ctx
)
}
const matchedUserArticleRecord = await ctx.models.userArticle.getByParameters(
saver.userId,
{
articleUrl: articleToSave.url,
}
)
if (matchedUserArticleRecord) {
await ctx.pubsub.pageCreated(saver.userId, input.url, input.originalContent)
await kx.transaction(async (tx) => {
await setClaims(tx, saver.userId)
await ctx.models.userArticle.update(
matchedUserArticleRecord.id,
{
savedAt: new Date(),
archivedAt: null,
},
tx
)
await ctx.models.articleSavingRequest.update(
savingRequest.id,
{
articleId: matchedUserArticleRecord.articleId,
status: ArticleSavingRequestStatus.Succeeded,
elasticPageId: existingPage.id,
},
tx
)
@@ -141,27 +134,15 @@ export const savePage = async (
} else if (shouldParseInBackend(input)) {
await createPageSaveRequest(saver.userId, input.url, ctx.models)
} else {
await ctx.pubsub.pageCreated(saver.userId, input.url, input.originalContent)
const pageId = await createPage(articleToSave, ctx)
await kx.transaction(async (tx) => {
await setClaims(tx, saver.userId)
const article = await ctx.models.article.create(articleToSave, tx)
await ctx.models.userArticle.create(
{
slug: slug,
userId: saver.userId,
articleId: article.id,
articleUrl: article.url,
articleHash: article.hash,
},
tx
)
await ctx.models.articleSavingRequest.update(
savingRequest.id,
{
articleId: article.id,
status: ArticleSavingRequestStatus.Succeeded,
elasticPageId: pageId,
},
tx
)

View File

@@ -66,6 +66,11 @@ interface BackendEnv {
gcsUploadBucket: string
gcsUploadSAKeyFilePath: string
}
elastic: {
url: string
username: string
password: string
}
}
/***
@@ -103,6 +108,8 @@ const nullableEnvVars = [
'GAUTH_SECRET',
'SEGMENT_WRITE_KEY',
'TWITTER_BEARER_TOKEN',
'ELASTIC_USERNAME',
'ELASTIC_PASSWORD',
] // Allow some vars to be null/empty
/* If not in GAE and Prod/QA/Demo env (f.e. on localhost/dev env), allow following env vars to be null */
@@ -196,6 +203,11 @@ export function getEnv(): BackendEnv {
gcsUploadBucket: parse('GCS_UPLOAD_BUCKET'),
gcsUploadSAKeyFilePath: parse('GCS_UPLOAD_SA_KEY_FILE_PATH'),
}
const elastic = {
url: parse('ELASTIC_URL'),
username: parse('ELASTIC_USERNAME'),
password: parse('ELASTIC_PASSWORD'),
}
return {
pg,
@@ -211,6 +223,7 @@ export function getEnv(): BackendEnv {
dev,
fileUpload,
queue,
elastic,
}
}
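ELASTIC_USERNAME and ELASTIC_PASSWORD are added to the nullable list above, so only ELASTIC_URL is mandatory. A standalone sketch of that nullable-variable handling (the real parse helper lives in env.ts; this version takes the env map explicitly for illustration):

```typescript
const nullableEnvVars = ['ELASTIC_USERNAME', 'ELASTIC_PASSWORD']

// Read an env var; only vars on the nullable list may be empty or unset.
function parseEnv(
  name: string,
  env: Record<string, string | undefined>
): string {
  const value = env[name]
  if (!value && !nullableEnvVars.includes(name)) {
    throw new Error(`Missing required environment variable: ${name}`)
  }
  return value ?? ''
}
```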

View File

@@ -206,7 +206,10 @@ export const articleSavingRequestPopulate = async (
await ctx.authTrx((tx) =>
ctx.models.articleSavingRequest.update(
articleSavingReqestId,
{ status: ArticleSavingRequestStatus.Succeeded, articleId: articleId },
{
status: ArticleSavingRequestStatus.Succeeded,
elasticPageId: articleId,
},
tx
)
)

View File

@@ -4,11 +4,12 @@
/* eslint-disable @typescript-eslint/no-unsafe-member-access */
/* eslint-disable @typescript-eslint/no-unsafe-assignment */
import {
ISearchParserDictionary,
parse,
SearchParserKeyWordOffset,
SearchParserTextOffset,
} from 'search-query-parser'
import { PageType } from '../generated/graphql'
import { PageType, SortBy, SortOrder, SortParams } from '../generated/graphql'
export enum ReadFilter {
ALL,
@@ -27,7 +28,18 @@ export type SearchFilter = {
inFilter: InFilter
readFilter: ReadFilter
typeFilter?: PageType | undefined
labelFilters?: string[]
labelFilters: LabelFilter[]
sortParams?: SortParams
}
export enum LabelFilterType {
INCLUDE,
EXCLUDE,
}
export type LabelFilter = {
type: LabelFilterType
labels: string[]
}
const parseIsFilter = (str: string | undefined): ReadFilter => {
@@ -78,18 +90,50 @@ const parseTypeFilter = (str: string | undefined): PageType | undefined => {
return undefined
}
const parseLabelFilters = (
str: string | undefined,
labelFilters: string[] | undefined
): string[] | undefined => {
const parseLabelFilter = (
str?: string,
exclude?: ISearchParserDictionary
): LabelFilter | undefined => {
if (str === undefined) {
return labelFilters
return undefined
}
// use lower case for label names
const label = str.toLowerCase()
const labels = str.toLocaleLowerCase().split(',')
return labelFilters ? labelFilters.concat(label) : [label]
// check if the labels are in the exclude list
const excluded =
exclude &&
exclude.label &&
labels.every((label) => exclude.label.includes(label))
return {
type: excluded ? LabelFilterType.EXCLUDE : LabelFilterType.INCLUDE,
labels,
}
}
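The comma-separated label syntax above can be exercised in isolation. This standalone sketch mirrors the include/exclude decision (enum and type names are copied from the diff; the parser's exclude dictionary is simplified to a plain string array):

```typescript
enum LabelFilterType {
  INCLUDE,
  EXCLUDE,
}

type LabelFilter = {
  type: LabelFilterType
  labels: string[]
}

// label:a,b yields one filter over lower-cased names; it becomes an EXCLUDE
// filter only when every named label appears in the exclude list (-label:a,b).
function parseLabelFilter(
  str?: string,
  excluded: string[] = []
): LabelFilter | undefined {
  if (str === undefined) {
    return undefined
  }
  const labels = str.toLocaleLowerCase().split(',')
  const isExcluded = labels.every((label) => excluded.includes(label))
  return {
    type: isExcluded ? LabelFilterType.EXCLUDE : LabelFilterType.INCLUDE,
    labels,
  }
}
```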
const parseSortParams = (str?: string): SortParams | undefined => {
if (str === undefined) {
return undefined
}
const [sort, order] = str.split(':')
const sortOrder =
order?.toUpperCase() === 'ASC' ? SortOrder.Ascending : SortOrder.Descending
switch (sort.toUpperCase()) {
case 'UPDATED_AT':
return {
by: SortBy.UpdatedTime,
order: sortOrder,
}
case 'SCORE':
return {
by: SortBy.Score,
order: sortOrder,
}
}
}
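The sort keyword accepts FIELD or FIELD:ORDER. A standalone sketch of that split (string literals stand in for the generated GraphQL enums, which are assumptions here):

```typescript
type SortOrder = 'ASCENDING' | 'DESCENDING'

interface SortParams {
  by: 'UPDATED_TIME' | 'SCORE'
  order: SortOrder
}

// sort:updated_at:asc -> { by: UPDATED_TIME, order: ASCENDING };
// a missing or unrecognized order falls back to DESCENDING.
function parseSortParams(str?: string): SortParams | undefined {
  if (str === undefined) {
    return undefined
  }
  const [sort, order] = str.split(':')
  const sortOrder: SortOrder =
    order?.toUpperCase() === 'ASC' ? 'ASCENDING' : 'DESCENDING'
  switch (sort.toUpperCase()) {
    case 'UPDATED_AT':
      return { by: 'UPDATED_TIME', order: sortOrder }
    case 'SCORE':
      return { by: 'SCORE', order: sortOrder }
  }
  return undefined
}
```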
export const parseSearchQuery = (query: string | undefined): SearchFilter => {
@@ -98,6 +142,7 @@ export const parseSearchQuery = (query: string | undefined): SearchFilter => {
query: searchQuery,
readFilter: ReadFilter.ALL,
inFilter: searchQuery ? InFilter.ALL : InFilter.INBOX,
labelFilters: [],
}
if (!searchQuery) {
@@ -105,11 +150,12 @@ export const parseSearchQuery = (query: string | undefined): SearchFilter => {
query: undefined,
inFilter: InFilter.INBOX,
readFilter: ReadFilter.ALL,
labelFilters: [],
}
}
const parsed = parse(searchQuery, {
keywords: ['in', 'is', 'type', 'label'],
keywords: ['in', 'is', 'type', 'label', 'sort'],
tokenize: true,
})
if (parsed.offsets) {
@@ -148,12 +194,15 @@ export const parseSearchQuery = (query: string | undefined): SearchFilter => {
case 'type':
result.typeFilter = parseTypeFilter(keyword.value)
break
case 'label':
result.labelFilters = parseLabelFilters(
keyword.value,
result.labelFilters
)
case 'label': {
const labelFilter = parseLabelFilter(keyword.value, parsed.exclude)
labelFilter && result.labelFilters.push(labelFilter)
break
}
case 'sort': {
result.sortParams = parseSortParams(keyword.value)
break
}
}
}
}

View File

@@ -2,6 +2,7 @@
/* eslint-disable @typescript-eslint/no-unsafe-assignment */
import { env } from '../env'
import { GetSignedUrlConfig, Storage } from '@google-cloud/storage'
import axios, { AxiosResponse } from 'axios'
/* On GAE/Prod, we shall rely on default app engine service account credentials.
* Two changes needed: 1) add default service account to our uploads GCS Bucket
@@ -95,3 +96,17 @@ export const generateUploadFilePathName = (
): string => {
return `u/${id}/${fileName}`
}
export const uploadToSignedUrl = async (
uploadUrl: string,
data: Buffer,
contentType: string
): Promise<AxiosResponse> => {
return axios.put(uploadUrl, data, {
headers: {
'Content-Type': contentType,
},
maxBodyLength: 1000000000,
maxContentLength: 100000000,
})
}

View File

@@ -0,0 +1,143 @@
import 'mocha'
import {
createPage,
deletePage,
getPageById,
getPageByParam,
searchPages,
updatePage,
} from '../../src/elastic'
import { PageType } from '../../src/generated/graphql'
import { expect } from 'chai'
import { InFilter, ReadFilter } from '../../src/utils/search'
import { Page, PageContext } from '../../src/elastic/types'
import { createPubSubClient } from '../../src/datalayer/pubsub'
describe('elastic api', () => {
const ctx: PageContext = { pubsub: createPubSubClient(), refresh: true }
let page: Page
before(async () => {
// create a testing page
page = {
id: '',
hash: 'test hash',
userId: 'test userId',
pageType: PageType.Article,
title: 'test title',
content: '<p>test</p>',
slug: 'test slug',
createdAt: new Date(),
updatedAt: new Date(),
readingProgressPercent: 100,
readingProgressAnchorIndex: 0,
url: 'https://blog.omnivore.app/p/getting-started-with-omnivore',
archivedAt: new Date(),
labels: [
{
id: 'Test label id',
name: 'test label',
color: '#ffffff',
createdAt: new Date(),
},
{
id: 'Test label id 2',
name: 'test label 2',
color: '#eeeeee',
createdAt: new Date(),
},
],
}
const pageId = await createPage(page, ctx)
if (!pageId) {
expect.fail('Failed to create page')
}
page.id = pageId
})
after(async () => {
// delete the testing page
await deletePage(page.id, ctx)
})
describe('createPage', () => {
let newPageId: string | undefined
after(async () => {
if (newPageId) {
await deletePage(newPageId, ctx)
}
})
it('creates a page', async () => {
const newPageData: Page = {
id: '',
hash: 'hash',
userId: 'userId',
pageType: PageType.Article,
title: 'test',
content: 'test',
slug: 'test',
createdAt: new Date(),
updatedAt: new Date(),
readingProgressPercent: 0,
readingProgressAnchorIndex: 0,
url: 'https://blog.omnivore.app/testUrl',
}
newPageId = await createPage(newPageData, ctx)
expect(newPageId).to.be.a('string')
})
})
describe('getPageByParam', () => {
it('gets a page by url', async () => {
const pageFound = await getPageByParam({
userId: page.userId,
url: page.url,
})
expect(pageFound).not.undefined
})
})
describe('getPageById', () => {
it('gets a page by id', async () => {
const pageFound = await getPageById(page.id)
expect(pageFound).not.undefined
})
})
describe('updatePage', () => {
it('updates a page', async () => {
const newTitle = 'new title'
const updatedPageData: Partial<Page> = {
title: newTitle,
}
await updatePage(page.id, updatedPageData, ctx)
const updatedPage = await getPageById(page.id)
expect(updatedPage?.title).to.eql(newTitle)
})
})
describe('searchPages', () => {
it('searches pages', async () => {
const searchResults = await searchPages(
{
inFilter: InFilter.ALL,
labelFilters: [],
readFilter: ReadFilter.ALL,
query: 'test',
},
page.userId
)
expect(searchResults).not.undefined
})
})
})

View File

@@ -1,6 +1,10 @@ import { createTestConnection } from './db'
import { createTestConnection } from './db'
import { initElasticsearch } from '../src/elastic'
export const mochaGlobalSetup = async () => {
await createTestConnection()
console.log('db connection created')
await initElasticsearch()
console.log('elasticsearch initialized')
}

View File

@@ -1,23 +1,31 @@ import { createTestLabel, createTestUser, deleteTestUser } from '../db'
import { createTestLabel, createTestUser, deleteTestUser } from '../db'
import {
createTestLabel,
createTestLink,
createTestPage,
createTestUser,
deleteTestUser,
} from '../db'
import { generateFakeUuid, graphqlRequest, request } from '../util'
createTestElasticPage,
generateFakeUuid,
graphqlRequest,
request,
} from '../util'
import * as chai from 'chai'
import { expect } from 'chai'
import { Link } from '../../src/entity/link'
import 'mocha'
import { User } from '../../src/entity/user'
import chaiString from 'chai-string'
import { getRepository } from 'typeorm'
import { LinkLabel } from '../../src/entity/link_label'
import { Label } from '../../src/entity/label'
import {
createPage,
deletePage,
getPageById,
updatePage,
} from '../../src/elastic'
import { PageType, UploadFileStatus } from '../../src/generated/graphql'
import { Page, PageContext } from '../../src/elastic/types'
import { getRepository } from 'typeorm'
import { UploadFile } from '../../src/entity/upload_file'
import { createPubSubClient } from '../../src/datalayer/pubsub'
chai.use(chaiString)
const ctx: PageContext = { pubsub: createPubSubClient(), refresh: true }
const archiveLink = async (authToken: string, linkId: string) => {
const query = `
mutation {
@@ -39,6 +47,45 @@ const archiveLink = async (authToken: string, linkId: string) => {
return graphqlRequest(query, authToken).expect(200)
}
const createArticleQuery = (
url: string,
source: string,
document: string,
title: string
) => {
return `
mutation {
createArticle(input: {
url: "${url}"
source: "${source}"
preparedDocument: {
document: "${document}"
pageInfo: {
contentType: "text/html"
title: "${title}"
}
}
}) {
... on CreateArticleSuccess {
createdArticle {
id
title
content
}
user {
id
name
}
created
}
... on CreateArticleError {
errorCodes
}
}
}
`
}
const articlesQuery = (after = '', order = 'ASCENDING') => {
return `
query {
@@ -81,6 +128,24 @@ const articlesQuery = (after = '', order = 'ASCENDING') => {
`
}
const getArticleQuery = (slug: string) => {
return `
query {
article(slug: "${slug}", username: "") {
... on ArticleSuccess {
article {
id
slug
}
}
... on ArticleError {
errorCodes
}
}
}
`
}
const savePageQuery = (url: string, title: string, originalContent: string) => {
return `
mutation {
@@ -104,12 +169,81 @@ const savePageQuery = (url: string, title: string, originalContent: string) => {
`
}
const saveFileQuery = (url: string, uploadFileId: string) => {
return `
mutation {
saveFile (
input: {
url: "${url}",
source: "test",
clientRequestId: "${generateFakeUuid()}",
uploadFileId: "${uploadFileId}",
}
) {
... on SaveSuccess {
url
}
... on SaveError {
errorCodes
}
}
}
`
}
const setBookmarkQuery = (articleId: string, bookmark: boolean) => {
return `
mutation {
setBookmarkArticle(
input: {
articleID: "${articleId}",
bookmark: ${bookmark}
}
) {
... on SetBookmarkArticleSuccess {
bookmarkedArticle {
id
}
}
... on SetBookmarkArticleError {
errorCodes
}
}
}
`
}
const saveArticleReadingProgressQuery = (
articleId: string,
progress: number
) => {
return `
mutation {
saveArticleReadingProgress(
input: {
id: "${articleId}",
readingProgressPercent: ${progress}
readingProgressAnchorIndex: 0
}
) {
... on SaveArticleReadingProgressSuccess {
updatedArticle {
id
readingProgressPercent
}
}
... on SaveArticleReadingProgressError {
errorCodes
}
}
}
`
}
describe('Article API', () => {
const username = 'fakeUser'
let authToken: string
let user: User
let links: Link[] = []
let label: Label
before(async () => {
// create test user and login
@@ -118,20 +252,6 @@ describe('Article API', () => {
.post('/local/debug/fake-user-login')
.send({ fakeEmail: user.email })
// Create some test links
for (let i = 0; i < 15; i++) {
const page = await createTestPage()
const link = await createTestLink(user, page)
links.push(link)
}
// create testing labels
label = await createTestLabel(user, 'label', '#ffffff')
// set label to a link
await getRepository(LinkLabel).save({
link: links[0],
label: label,
})
authToken = res.body.authToken
})
@@ -140,20 +260,147 @@ describe('Article API', () => {
await deleteTestUser(username)
})
describe('CreateArticle', () => {
let query = ''
let url = ''
let source = ''
let document = ''
let title = ''
let pageId = ''
beforeEach(async () => {
query = createArticleQuery(url, source, document, title)
})
context('when saving from document', () => {
before(() => {
url = 'https://blog.omnivore.app/p/testing-is-fun-with-omnivore'
source = 'puppeteer-parse'
document = '<p>test</p>'
title = 'new title'
})
after(async () => {
await deletePage(pageId, ctx)
})
it('should create an article', async () => {
const res = await graphqlRequest(query, authToken).expect(200)
expect(res.body.data.createArticle.createdArticle.title).to.eql(title)
pageId = res.body.data.createArticle.createdArticle.id
})
})
})
describe('GetArticle', () => {
const realSlug = 'testing-is-really-fun-with-omnivore'
let query = ''
let slug = ''
let pageId: string | undefined
before(async () => {
const page = {
id: '',
hash: 'test hash',
userId: user.id,
pageType: PageType.Article,
title: 'test title',
content: '<p>test</p>',
slug: realSlug,
createdAt: new Date(),
updatedAt: new Date(),
readingProgressPercent: 100,
readingProgressAnchorIndex: 0,
url: 'https://blog.omnivore.app/test-with-omnivore',
savedAt: new Date(),
} as Page
pageId = await createPage(page, ctx)
})
after(async () => {
if (pageId) {
await deletePage(pageId, ctx)
}
})
beforeEach(async () => {
query = getArticleQuery(slug)
})
context('when article exists', () => {
before(() => {
slug = realSlug
})
it('should return the article', async () => {
const res = await graphqlRequest(query, authToken).expect(200)
expect(res.body.data.article.article.slug).to.eql(slug)
})
})
context('when article does not exist', () => {
before(() => {
slug = 'not-a-real-slug'
})
it('should return an error', async () => {
const res = await graphqlRequest(query, authToken).expect(200)
expect(res.body.data.article.errorCodes).to.eql(['NOT_FOUND'])
})
})
})
describe('GetArticles', () => {
let query = ''
let after = ''
let pages: Page[] = []
let label: Label
before(async () => {
// Create some test pages
for (let i = 0; i < 15; i++) {
const page = {
id: '',
hash: 'test hash',
userId: user.id,
pageType: PageType.Article,
title: 'test title',
content: '<p>test</p>',
slug: 'test slug',
createdAt: new Date(),
updatedAt: new Date(),
readingProgressPercent: 100,
readingProgressAnchorIndex: 0,
url: 'https://blog.omnivore.app/p/getting-started-with-omnivore',
savedAt: new Date(),
} as Page
const pageId = await createPage(page, ctx)
if (!pageId) {
expect.fail('Failed to create page')
}
page.id = pageId
pages.push(page)
}
// create testing labels
label = await createTestLabel(user, 'label', '#ffffff')
// set label to a link
await updatePage(
pages[0].id,
{
...pages[0],
labels: [{ id: label.id, name: label.name, color: label.color }],
},
ctx
)
})
beforeEach(async () => {
query = articlesQuery(after)
})
it('should return linkId', async () => {
const res = await graphqlRequest(query, authToken).expect(200)
expect(res.body.data.articles.edges[0].node.linkId).to.eql(links[0].id)
})
it('should return labels', async () => {
const res = await graphqlRequest(query, authToken).expect(200)
@@ -166,11 +413,11 @@ describe('Article API', () => {
const res = await graphqlRequest(query, authToken).expect(200)
expect(res.body.data.articles.edges.length).to.eql(5)
expect(res.body.data.articles.edges[0].node.id).to.eql(links[0].page.id)
expect(res.body.data.articles.edges[1].node.id).to.eql(links[1].page.id)
expect(res.body.data.articles.edges[2].node.id).to.eql(links[2].page.id)
expect(res.body.data.articles.edges[3].node.id).to.eql(links[3].page.id)
expect(res.body.data.articles.edges[4].node.id).to.eql(links[4].page.id)
expect(res.body.data.articles.edges[0].node.id).to.eql(pages[0].id)
expect(res.body.data.articles.edges[1].node.id).to.eql(pages[1].id)
expect(res.body.data.articles.edges[2].node.id).to.eql(pages[2].id)
expect(res.body.data.articles.edges[3].node.id).to.eql(pages[3].id)
expect(res.body.data.articles.edges[4].node.id).to.eql(pages[4].id)
})
it('should set the pageInfo', async () => {
@@ -197,11 +444,11 @@ describe('Article API', () => {
const res = await graphqlRequest(query, authToken).expect(200)
expect(res.body.data.articles.edges.length).to.eql(5)
expect(res.body.data.articles.edges[0].node.id).to.eql(links[5].page.id)
expect(res.body.data.articles.edges[1].node.id).to.eql(links[6].page.id)
expect(res.body.data.articles.edges[2].node.id).to.eql(links[7].page.id)
expect(res.body.data.articles.edges[3].node.id).to.eql(links[8].page.id)
expect(res.body.data.articles.edges[4].node.id).to.eql(links[9].page.id)
expect(res.body.data.articles.edges[0].node.id).to.eql(pages[5].id)
expect(res.body.data.articles.edges[1].node.id).to.eql(pages[6].id)
expect(res.body.data.articles.edges[2].node.id).to.eql(pages[7].id)
expect(res.body.data.articles.edges[3].node.id).to.eql(pages[8].id)
expect(res.body.data.articles.edges[4].node.id).to.eql(pages[9].id)
})
it('should set the pageInfo', async () => {
@@ -253,31 +500,203 @@ describe('Article API', () => {
authToken
).expect(200)
let allLinks
// Save a link, then archive it
let allLinks = await graphqlRequest(
articlesQuery('', 'DESCENDING'),
authToken
).expect(200)
const justSavedId = allLinks.body.data.articles.edges[0].node.id
await archiveLink(authToken, justSavedId)
// wait briefly so the saved page is visible in elastic before querying
await new Promise((resolve) => setTimeout(resolve, 100))
allLinks = await graphqlRequest(
articlesQuery('', 'DESCENDING'),
authToken
).expect(200)
const justSavedId = allLinks.body.data.articles.edges[0].node.id
await archiveLink(authToken, justSavedId)
// test the negative case, ensuring the archive link wasn't returned
allLinks = await graphqlRequest(
articlesQuery('', 'DESCENDING'),
authToken
).expect(200)
expect(allLinks.body.data.articles.edges[0].node.url).to.not.eq(url)
// wait briefly for the archive to be reflected in elastic
await new Promise((resolve) => setTimeout(resolve, 100))
allLinks = await graphqlRequest(
articlesQuery('', 'DESCENDING'),
authToken
).expect(200)
expect(allLinks.body.data.articles.edges[0].node.url).to.not.eq(url)
// Now save the link again, and ensure it is returned
const resaved = await graphqlRequest(
await graphqlRequest(
savePageQuery(url, title, originalContent),
authToken
).expect(200)
allLinks = await graphqlRequest(
articlesQuery('', 'DESCENDING'),
authToken
).expect(200)
expect(allLinks.body.data.articles.edges[0].node.url).to.eq(url)
// wait briefly for the re-saved page to be reflected in elastic
await new Promise((resolve) => setTimeout(resolve, 100))
allLinks = await graphqlRequest(
articlesQuery('', 'DESCENDING'),
authToken
).expect(200)
expect(allLinks.body.data.articles.edges[0].node.url).to.eq(url)
})
})
})
describe('setBookmarkArticle', () => {
let query = ''
let articleId = ''
let bookmark = true
let pageId = ''
before(async () => {
const page: Page = {
id: '',
hash: 'test hash',
userId: user.id,
pageType: PageType.Article,
title: 'test title',
content: '<p>test</p>',
createdAt: new Date(),
url: 'https://blog.omnivore.app/setBookmarkArticle',
slug: 'test-with-omnivore',
}
const newPageId = await createPage(page, ctx)
if (newPageId) {
pageId = newPageId
}
})
after(async () => {
if (pageId) {
await deletePage(pageId, ctx)
}
})
beforeEach(() => {
query = setBookmarkQuery(articleId, bookmark)
})
context('when we set a bookmark on an article', () => {
before(async () => {
articleId = pageId
bookmark = true
})
it('should bookmark an article', async () => {
const res = await graphqlRequest(query, authToken).expect(200)
expect(res.body.data.setBookmarkArticle.bookmarkedArticle.id).to.eq(
articleId
)
})
})
context('when we unset a bookmark on an article', () => {
before(async () => {
articleId = pageId
bookmark = false
})
it('should delete an article', async () => {
await graphqlRequest(query, authToken).expect(200)
const pageId = await getPageById(articleId)
expect(pageId).to.be.undefined
})
})
})
describe('saveArticleReadingProgressResolver', () => {
let query = ''
let articleId = ''
let progress = 0.5
let pageId = ''
before(async () => {
pageId = (await createTestElasticPage(user)).id
})
after(async () => {
if (pageId) {
await deletePage(pageId, ctx)
}
})
beforeEach(() => {
query = saveArticleReadingProgressQuery(articleId, progress)
})
context('when we save a reading progress on an article', () => {
before(async () => {
articleId = pageId
progress = 0.5
})
it('should save a reading progress on an article', async () => {
const res = await graphqlRequest(query, authToken).expect(200)
expect(
res.body.data.saveArticleReadingProgress.updatedArticle
.readingProgressPercent
).to.eq(progress)
})
it('should not allow setting the reading progress lower than current progress', async () => {
const firstQuery = saveArticleReadingProgressQuery(articleId, 75)
const firstRes = await graphqlRequest(firstQuery, authToken).expect(200)
expect(
firstRes.body.data.saveArticleReadingProgress.updatedArticle
.readingProgressPercent
).to.eq(75)
// Now try to set a lower value (50); the stored value should not decrease.
// Await a short delay so the first update has been applied before re-checking.
await new Promise((resolve) => setTimeout(resolve, 100))
const secondQuery = saveArticleReadingProgressQuery(articleId, 50)
const secondRes = await graphqlRequest(secondQuery, authToken).expect(200)
expect(
secondRes.body.data.saveArticleReadingProgress.updatedArticle
.readingProgressPercent
).to.eq(75)
})
})
})
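Several of the tests above wait for Elastic's near-real-time index to refresh before re-querying. A small promise-based helper keeps that wait awaitable, so the assertions that follow it run inside the mocha test instead of inside a fire-and-forget `setTimeout` callback whose failures are silently dropped (a sketch; the `delay` name is not part of this PR):

```typescript
// Awaitable delay: resolves after `ms` milliseconds. Awaiting it keeps any
// assertions that follow inside the test's lifetime, unlike a bare
// setTimeout callback, whose thrown errors mocha never sees.
const delay = (ms: number): Promise<void> =>
  new Promise((resolve) => setTimeout(resolve, ms))

// Typical use inside an async test body:
//   await delay(100) // give the Elastic index time to refresh
//   expect(...).to.eq(...)
```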
describe('SaveFile', () => {
let query = ''
let url = ''
let uploadFileId = ''
beforeEach(() => {
query = saveFileQuery(url, uploadFileId)
})
context('when the file is not uploaded', () => {
before(async () => {
url = 'fake url'
uploadFileId = generateFakeUuid()
})
it('should return Unauthorized error', async () => {
const res = await graphqlRequest(query, authToken).expect(200)
expect(res.body.data.saveFile.errorCodes).to.eql(['UNAUTHORIZED'])
})
})
context('when the file is uploaded', () => {
before(async () => {
url = 'https://example.com/'
const uploadFile = await getRepository(UploadFile).save({
fileName: 'test.pdf',
contentType: 'application/pdf',
url: url,
user: user,
status: UploadFileStatus.Initialized,
})
uploadFileId = uploadFile.id
})
it('should return the new url', async () => {
const res = await graphqlRequest(query, authToken).expect(200)
expect(res.body.data.saveFile.url).to.startsWith(
'http://localhost:3000/fakeUser/links'
)
})
})
})


@ -0,0 +1,94 @@
import { createTestUser, deleteTestUser } from '../db'
import {
createTestElasticPage,
generateFakeUuid,
graphqlRequest,
request,
} from '../util'
import * as chai from 'chai'
import { expect } from 'chai'
import 'mocha'
import { User } from '../../src/entity/user'
import chaiString from 'chai-string'
import { deletePage } from '../../src/elastic'
import { createPubSubClient } from '../../src/datalayer/pubsub'
chai.use(chaiString)
const ctx = { pubsub: createPubSubClient() }
const createHighlightQuery = (
authToken: string,
linkId: string,
highlightId: string,
shortHighlightId: string,
prefix = '_prefix',
suffix = '_suffix',
quote = '_quote',
patch = '_patch'
) => {
return `
mutation {
createHighlight(
input: {
prefix: "${prefix}",
suffix: "${suffix}",
quote: "${quote}",
id: "${highlightId}",
shortId: "${shortHighlightId}",
patch: "${patch}",
articleId: "${linkId}",
}
) {
... on CreateHighlightSuccess {
highlight {
id
}
}
... on CreateHighlightError {
errorCodes
}
}
}
`
}
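The mutation above is built by string interpolation, which breaks as soon as any value contains a double quote or backslash. A variables-based document sidesteps the escaping problem; the sketch below keeps the same operation and selection set, but the input type name and the call shape are assumptions for illustration, not part of this PR:

```typescript
// Sketch: the createHighlight mutation expressed with GraphQL variables, so
// quote characters inside prefix/suffix/quote/patch cannot corrupt the
// document. Values travel in the JSON `variables` payload instead.
const CREATE_HIGHLIGHT_MUTATION = `
  mutation CreateHighlight($input: CreateHighlightInput!) {
    createHighlight(input: $input) {
      ... on CreateHighlightSuccess {
        highlight {
          id
        }
      }
      ... on CreateHighlightError {
        errorCodes
      }
    }
  }
`

// Hypothetical variables object, passed separately from the document.
const variables = {
  input: {
    prefix: '_prefix',
    suffix: '_suffix',
    quote: 'a "quoted" passage', // safe: no manual escaping needed
    id: 'some-highlight-uuid',
    shortId: '_short_id',
    patch: '_patch',
    articleId: 'some-page-id',
  },
}
```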
describe('Highlights API', () => {
const username = 'fakeUser'
let authToken: string
let user: User
let pageId: string
before(async () => {
// create test user and login
user = await createTestUser(username)
const res = await request
.post('/local/debug/fake-user-login')
.send({ fakeEmail: user.email })
authToken = res.body.authToken
pageId = (await createTestElasticPage(user)).id
})
after(async () => {
await deleteTestUser(username)
if (pageId) {
await deletePage(pageId, ctx)
}
})
context('createHighlightMutation', () => {
it('should not fail', async () => {
const highlightId = generateFakeUuid()
const shortHighlightId = '_short_id'
const query = createHighlightQuery(
authToken,
pageId,
highlightId,
shortHighlightId
)
const res = await graphqlRequest(query, authToken).expect(200)
expect(res.body.data.createHighlight.highlight.id).to.eq(highlightId)
})
})
})


@ -1,19 +1,17 @@
import { createTestLabel, createTestUser, deleteTestUser } from '../db'
import {
createTestLabel,
createTestLink,
createTestPage,
createTestUser,
deleteTestUser,
} from '../db'
import { generateFakeUuid, graphqlRequest, request } from '../util'
import { Link } from '../../src/entity/link'
createTestElasticPage,
generateFakeUuid,
graphqlRequest,
request,
} from '../util'
import { Label } from '../../src/entity/label'
import { expect } from 'chai'
import { Page } from '../../src/entity/page'
import { getRepository } from 'typeorm'
import 'mocha'
import { LinkLabel } from '../../src/entity/link_label'
import { User } from '../../src/entity/user'
import { Page } from '../../src/elastic/types'
import { getPageById } from '../../src/elastic'
describe('Labels API', () => {
const username = 'fakeUser'
@ -21,7 +19,6 @@ describe('Labels API', () => {
let user: User
let authToken: string
let page: Page
let link: Link
let labels: Label[]
before(async () => {
@ -38,18 +35,13 @@ describe('Labels API', () => {
const label2 = await createTestLabel(user, 'label_2', '#eeeeee')
labels = [label1, label2]
page = await createTestPage()
link = await createTestLink(user, page)
// create a page with label
const existingLabelOfLink = await createTestLabel(
user,
'different_label',
'#dddddd'
)
// set another label to link
await getRepository(LinkLabel).save({
link,
label: existingLabelOfLink,
})
page = await createTestElasticPage(user, [existingLabelOfLink])
})
after(async () => {
@ -245,7 +237,7 @@ describe('Labels API', () => {
describe('Set labels', () => {
let query: string
let linkId: string
let pageId: string
let labelIds: string[] = []
beforeEach(() => {
@ -253,7 +245,7 @@ describe('Labels API', () => {
mutation {
setLabels(
input: {
linkId: "${linkId}",
linkId: "${pageId}",
labelIds: [
"${labelIds[0]}",
"${labelIds[1]}"
@ -276,22 +268,20 @@ describe('Labels API', () => {
context('when labels exists', () => {
before(() => {
linkId = link.id
pageId = page.id
labelIds = [labels[0].id, labels[1].id]
})
it('should set labels', async () => {
await graphqlRequest(query, authToken).expect(200)
const link = await getRepository(Link).findOne(linkId, {
relations: ['labels'],
})
expect(link?.labels?.map((l) => l.id)).to.eql(labelIds)
const page = await getPageById(pageId)
expect(page?.labels?.map((l) => l.id)).to.eql(labelIds)
})
})
context('when labels not exist', () => {
before(() => {
linkId = link.id
pageId = page.id
labelIds = [generateFakeUuid(), generateFakeUuid()]
})
@ -303,7 +293,7 @@ describe('Labels API', () => {
context('when link not exist', () => {
before(() => {
linkId = generateFakeUuid()
pageId = generateFakeUuid()
labelIds = [labels[0].id, labels[1].id]
})


@ -2,13 +2,13 @@ import {
createTestNewsletterEmail,
createTestUser,
deleteTestUser,
getLink,
} from '../db'
import { request } from '../util'
import { User } from '../../src/entity/user'
import 'mocha'
import * as jwt from 'jsonwebtoken'
import { expect } from 'chai'
import { getPageById } from '../../src/elastic'
describe('PDF attachments Router', () => {
const username = 'fakeUser'
@ -49,7 +49,9 @@ describe('PDF attachments Router', () => {
})
describe('create article', () => {
it('create article with uploaded file id and url', async () => {
let uploadFileId: string
before(async () => {
// upload file first
const testFile = 'testFile.pdf'
const res = await request
@ -59,8 +61,10 @@ describe('PDF attachments Router', () => {
email: newsletterEmail,
fileName: testFile,
})
const uploadFileId = res.body.id
uploadFileId = res.body.id
})
it('create article with uploaded file id and url', async () => {
// create article
const res2 = await request
.post('/svc/pdf-attachments/create-article')
@ -72,7 +76,7 @@ describe('PDF attachments Router', () => {
.expect(200)
expect(res2.body.id).to.be.a('string')
const link = await getLink(res2.body.id)
const link = await getPageById(res2.body.id)
expect(link).to.exist
})


@ -2,6 +2,12 @@ import { createApp } from '../src/server'
import supertest from 'supertest'
import { v4 } from 'uuid'
import { corsConfig } from '../src/utils/corsConfig'
import { Page } from '../src/elastic/types'
import { PageType } from '../src/generated/graphql'
import { createPage } from '../src/elastic'
import { User } from '../src/entity/user'
import { Label } from '../src/entity/label'
import { createPubSubClient } from '../src/datalayer/pubsub'
const { app, apollo } = createApp()
export const request = supertest(app)
@ -28,3 +34,30 @@ export const graphqlRequest = (
export const generateFakeUuid = () => {
return v4()
}
export const createTestElasticPage = async (
user: User,
labels?: Label[]
): Promise<Page> => {
const page: Page = {
id: '',
hash: 'test hash',
userId: user.id,
pageType: PageType.Article,
title: 'test title',
content: '<p>test content</p>',
createdAt: new Date(),
url: 'https://example.com/test-url',
slug: 'test-with-omnivore',
labels: labels,
}
const pageId = await createPage(page, {
pubsub: createPubSubClient(),
refresh: true,
})
if (pageId) {
page.id = pageId
}
return page
}


@ -1,5 +0,0 @@
NEXT_PUBLIC_APP_ENV=local
NEXT_PUBLIC_LOCAL_BASE_URL="http://localhost:3000"
NEXT_PUBLIC_LOCAL_SERVER_BASE_URL="http://localhost:4000"
NEXT_PUBLIC_LOCAL_HIGHLIGHTS_BASE_URL="http://localhost:4000"


@ -26,7 +26,7 @@ export function EditLabelsModal(props: EditLabelsModalProps): JSX.Element {
const { labels } = useGetLabelsQuery()
const saveAndExit = useCallback(async () => {
const result = await setLabelsMutation(props.article.linkId, selectedLabels)
const result = await setLabelsMutation(props.article.id, selectedLabels)
console.log('result of setting labels', result)
props.onOpenChange(false)
props.setLabels(selectedLabels)


@ -1,10 +1,10 @@
import { Box, HStack, VStack } from './../../elements/LayoutPrimitives'
import { useGetLibraryItemsQuery } from '../../../lib/networking/queries/useGetLibraryItemsQuery'
import { useGetViewerQuery } from '../../../lib/networking/queries/useGetViewerQuery'
import type {
LibraryItem,
LibraryItemsQueryInput,
} from '../../../lib/networking/queries/useGetLibraryItemsQuery'
import { useGetLibraryItemsQuery } from '../../../lib/networking/queries/useGetLibraryItemsQuery'
import { useGetViewerQuery } from '../../../lib/networking/queries/useGetViewerQuery'
import {
LinkedItemCard,
LinkedItemCardAction,
@ -19,8 +19,8 @@ import { styled } from '../../tokens/stitches.config'
import { ListLayoutIcon } from '../../elements/images/ListLayoutIcon'
import { GridLayoutIcon } from '../../elements/images/GridLayoutIcon'
import {
searchBarCommands,
libraryListCommands,
searchBarCommands,
} from '../../../lib/keyboardShortcuts/navigationShortcuts'
import { useKeyboardShortcuts } from '../../../lib/keyboardShortcuts/useKeyboardShortcuts'
import { ShareArticleModal } from '../article/ShareArticleModal'
@ -46,7 +46,7 @@ export function HomeFeedContainer(props: HomeFeedContainerProps): JSX.Element {
const { viewerData } = useGetViewerQuery()
const router = useRouter()
const defaultQuery = {
limit: 20,
limit: 10,
sortDescending: true,
searchQuery: undefined,
}
@ -99,11 +99,10 @@ export function HomeFeedContainer(props: HomeFeedContainerProps): JSX.Element {
}, [articlesPages])
const libraryItems = useMemo(() => {
const items = (
const items =
articlesPages?.flatMap((ad) => {
return ad.articles.edges
}) || []
)
return items
}, [articlesPages, performActionOnItem])
@ -129,26 +128,26 @@ export function HomeFeedContainer(props: HomeFeedContainerProps): JSX.Element {
activateCard(firstItem.node.id)
}, [libraryItems])
const activateCard = useCallback((id: string) => {
if (!document.getElementById(id)) {
return;
}
setActiveCardId(id)
scrollToActiveCard(id, true)
}, [libraryItems])
const activateCard = useCallback(
(id: string) => {
if (!document.getElementById(id)) {
return
}
setActiveCardId(id)
scrollToActiveCard(id, true)
},
[libraryItems]
)
const isVisible = function (ele: HTMLElement, container: HTMLElement) {
const eleTop = ele.offsetTop;
const eleBottom = eleTop + ele.clientHeight;
const eleTop = ele.offsetTop
const eleBottom = eleTop + ele.clientHeight
const containerTop = container.scrollTop + 200;
const containerBottom = containerTop + container.clientHeight;
const containerTop = container.scrollTop + 200
const containerBottom = containerTop + container.clientHeight
return (
(eleTop >= containerTop && eleBottom <= containerBottom)
);
};
return eleTop >= containerTop && eleBottom <= containerBottom
}
const scrollToActiveCard = useCallback(
(id: string | null, isSmouth?: boolean): void => {
@ -156,14 +155,17 @@ export function HomeFeedContainer(props: HomeFeedContainerProps): JSX.Element {
const target = document.getElementById(id)
if (target) {
try {
if (props.scrollElementRef.current && !isVisible(target, props.scrollElementRef.current)) {
if (
props.scrollElementRef.current &&
!isVisible(target, props.scrollElementRef.current)
) {
target.scrollIntoView({
block: 'center',
behavior: isSmouth ? 'smooth' : 'auto',
})
}
target.focus({
preventScroll: true
preventScroll: true,
})
} catch (error) {
console.log('Cannot Scroll', error)
@ -186,9 +188,7 @@ export function HomeFeedContainer(props: HomeFeedContainerProps): JSX.Element {
return undefined
}
return libraryItems.find(
(item) => item.node.id === activeCardId
)
return libraryItems.find((item) => item.node.id === activeCardId)
}, [libraryItems, activeCardId])
const activeItemIndex = useMemo(() => {
@ -277,7 +277,8 @@ export function HomeFeedContainer(props: HomeFeedContainerProps): JSX.Element {
break
case 'moveFocusToNextListItem': {
const currentItemIndex = activeItemIndex
const nextItemIndex = currentItemIndex == undefined ? 0 : currentItemIndex + 1
const nextItemIndex =
currentItemIndex == undefined ? 0 : currentItemIndex + 1
const nextItem = libraryItems[nextItemIndex]
if (nextItem) {
activateCard(nextItem.node.id)
@ -286,7 +287,8 @@ export function HomeFeedContainer(props: HomeFeedContainerProps): JSX.Element {
}
case 'moveFocusToPreviousListItem': {
const currentItemIndex = activeItemIndex
const previousItemIndex = currentItemIndex == undefined ? 0 : currentItemIndex - 1
const previousItemIndex =
currentItemIndex == undefined ? 0 : currentItemIndex - 1
const previousItem = libraryItems[previousItemIndex]
if (previousItem) {
activateCard(previousItem.node.id)
@ -302,9 +304,7 @@ export function HomeFeedContainer(props: HomeFeedContainerProps): JSX.Element {
)
const nextItem = libraryItems[nextItemIndex]
if (nextItem) {
const nextItemElement = document.getElementById(
nextItem.node.id
)
const nextItemElement = document.getElementById(nextItem.node.id)
if (nextItemElement) {
activateCard(nextItem.node.id)
}
@ -323,9 +323,7 @@ export function HomeFeedContainer(props: HomeFeedContainerProps): JSX.Element {
)
const nextItem = libraryItems[nextItemIndex]
if (nextItem) {
const nextItemElement = document.getElementById(
nextItem.node.id
)
const nextItemElement = document.getElementById(nextItem.node.id)
if (nextItemElement) {
activateCard(nextItem.node.id)
}
@ -589,12 +587,12 @@ function HomeFeedGrid(props: HomeFeedContentProps): JSX.Element {
'&:focus': {
'> div': {
bg: '$grayBgActive',
}
},
},
'&:hover': {
'> div': {
bg: '$grayBgActive',
}
},
},
}}
>
@ -674,7 +672,9 @@ function HomeFeedGrid(props: HomeFeedContentProps): JSX.Element {
toast.success(msg, { position: 'bottom-right' })
})
.catch((error) => {
toast.error('There was an error snoozing your link.', { position: 'bottom-right' })
toast.error('There was an error snoozing your link.', {
position: 'bottom-right',
})
})
}}
onOpenChange={() => {


@ -2,7 +2,7 @@ import { gql } from 'graphql-request'
import { gqlFetcher } from '../networkHelpers'
export async function setLabelsMutation(
linkId: string,
pageId: string,
labelIds: string[]
): Promise<any | undefined> {
const mutation = gql`
@ -21,7 +21,7 @@ export async function setLabelsMutation(
`
try {
const data = await gqlFetcher(mutation, { input: { linkId, labelIds } })
const data = await gqlFetcher(mutation, { input: { pageId, labelIds } })
console.log(data)
return data
} catch (error) {


@ -142,7 +142,8 @@ export function useGetLibraryItemsQuery({
},
(_query, _l, _s, _sq, cursor: string) => {
return gqlFetcher(query, { ...variables, after: cursor }, true)
}
},
{ revalidateFirstPage: false }
)
let responseError = error
@ -153,7 +154,9 @@ export function useGetLibraryItemsQuery({
// we invalidate the data and return the error. We also zero out
// the response in the case of an error.
if (!error && responsePages) {
const errors = responsePages.filter((d) => d.articles.errorCodes && d.articles.errorCodes.length > 0)
const errors = responsePages.filter(
(d) => d.articles.errorCodes && d.articles.errorCodes.length > 0
)
if (errors?.length > 0) {
responseError = errors
responsePages = undefined

yarn.lock

@ -106,7 +106,7 @@
semver "^6.3.0"
source-map "^0.5.0"
"@babel/core@^7.12.3", "@babel/core@^7.8.0":
"@babel/core@^7.12.3", "@babel/core@^7.7.5", "@babel/core@^7.8.0":
version "7.17.5"
resolved "https://registry.yarnpkg.com/@babel/core/-/core-7.17.5.tgz#6cd2e836058c28f06a4ca8ee7ed955bbf37c8225"
integrity sha512-/BBMw4EvjmyquN5O+t5eh0+YqB3XXJkYD2cjKpYtWOfFy4lQ4UozNSmxAcWT8r2XtZs0ewG+zrfsqeR15i1ajA==
@ -1041,6 +1041,17 @@
enabled "2.0.x"
kuler "^2.0.0"
"@elastic/elasticsearch@~7.12.0":
version "7.12.0"
resolved "https://registry.yarnpkg.com/@elastic/elasticsearch/-/elasticsearch-7.12.0.tgz#dbb51a2841f644b670a56d8c15899e860928856f"
integrity sha512-GquUEytCijFRPEk3DKkkDdyhspB3qbucVQOwih9uNyz3iz804I+nGBUsFo2LwVvLQmQfEM0IY2+yoYfEz5wMug==
dependencies:
debug "^4.3.1"
hpagent "^0.1.1"
ms "^2.1.3"
pump "^3.0.0"
secure-json-parse "^2.3.1"
"@endemolshinegroup/cosmiconfig-typescript-loader@3.0.2":
version "3.0.2"
resolved "https://registry.yarnpkg.com/@endemolshinegroup/cosmiconfig-typescript-loader/-/cosmiconfig-typescript-loader-3.0.2.tgz#eea4635828dde372838b0909693ebd9aafeec22d"
@ -1871,6 +1882,13 @@
js-yaml "^3.13.1"
resolve-from "^5.0.0"
"@istanbuljs/nyc-config-typescript@^1.0.2":
version "1.0.2"
resolved "https://registry.yarnpkg.com/@istanbuljs/nyc-config-typescript/-/nyc-config-typescript-1.0.2.tgz#1f5235b28540a07219ae0dd42014912a0b19cf89"
integrity sha512-iKGIyMoyJuFnJRSVTZ78POIRvNnwZaWIf8vG4ZS3rQq58MMDrqEX2nnzx0R28V2X8JvmKYiqY9FP2hlJsm8A0w==
dependencies:
"@istanbuljs/schema" "^0.1.2"
"@istanbuljs/schema@^0.1.2":
version "0.1.3"
resolved "https://registry.yarnpkg.com/@istanbuljs/schema/-/schema-0.1.3.tgz#e45e384e4b8ec16bce2fd903af78450f6bf7ec98"
@ -6207,6 +6225,13 @@ app-root-path@^3.0.0:
resolved "https://registry.yarnpkg.com/app-root-path/-/app-root-path-3.0.0.tgz#210b6f43873227e18a4b810a032283311555d5ad"
integrity sha512-qMcx+Gy2UZynHjOHOIXPNvpf+9cjvk3cWrBBK7zg4gH9+clobJRb9NGzcT7mQTcV/6Gm/1WelUtqxVXnNlrwcw==
append-transform@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/append-transform/-/append-transform-2.0.0.tgz#99d9d29c7b38391e6f428d28ce136551f0b77e12"
integrity sha512-7yeyCEurROLQJFv5Xj4lEGTy0borxepjFv1g22oAdqFu//SrAlDl1O1Nxx15SH1RoliUml6p8dwJW9jvZughhg==
dependencies:
default-require-extensions "^3.0.0"
aproba@^1.0.3:
version "1.2.0"
resolved "https://registry.yarnpkg.com/aproba/-/aproba-1.2.0.tgz#6802e6264efd18c790a1b0d517f0f2627bf2c94a"
@ -6217,6 +6242,11 @@ aproba@^2.0.0:
resolved "https://registry.yarnpkg.com/aproba/-/aproba-2.0.0.tgz#52520b8ae5b569215b354efc0caa3fe1e45a8adc"
integrity sha512-lYe4Gx7QT+MKGbDsA+Z+he/Wtef0BiwDOlK/XkBrdfsh9J/jPPXbX0tE9x9cl27Tmu5gg3QUbUrQYa/y+KOHPQ==
archy@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/archy/-/archy-1.0.0.tgz#f9c8c13757cc1dd7bc379ac77b2c62a5c2868c40"
integrity sha1-+cjBN1fMHde8N5rHeyxipcKGjEA=
are-we-there-yet@~1.1.2:
version "1.1.5"
resolved "https://registry.yarnpkg.com/are-we-there-yet/-/are-we-there-yet-1.1.5.tgz#4b35c2944f062a8bfcda66410760350fe9ddfc21"
@ -6915,6 +6945,16 @@ cacheable-request@^6.0.0:
normalize-url "^4.1.0"
responselike "^1.0.2"
caching-transform@^4.0.0:
version "4.0.0"
resolved "https://registry.yarnpkg.com/caching-transform/-/caching-transform-4.0.0.tgz#00d297a4206d71e2163c39eaffa8157ac0651f0f"
integrity sha512-kpqOvwXnjjN44D89K5ccQC+RUrsy7jB/XLlRrx0D7/2HNcTPqzsb6XgYoErwko6QsV184CA2YgS1fxDiiDZMWA==
dependencies:
hasha "^5.0.0"
make-dir "^3.0.0"
package-hash "^4.0.0"
write-file-atomic "^3.0.0"
call-bind@^1.0.0, call-bind@^1.0.2:
version "1.0.2"
resolved "https://registry.yarnpkg.com/call-bind/-/call-bind-1.0.2.tgz#b1d4e89e688119c3c9a903ad30abb2f6a919be3c"
@ -7748,7 +7788,7 @@ cross-fetch@^3.1.5:
dependencies:
node-fetch "2.6.7"
cross-spawn@^7.0.2, cross-spawn@^7.0.3:
cross-spawn@^7.0.0, cross-spawn@^7.0.2, cross-spawn@^7.0.3:
version "7.0.3"
resolved "https://registry.yarnpkg.com/cross-spawn/-/cross-spawn-7.0.3.tgz#f73a85b9d5d41d045551c177e2882d4ac85728a6"
integrity sha512-iRDPJKUPVEND7dHPO8rkbOnPpyDygcDFtWjpeWNCgy8WP2rXcxXL8TskReQl6OrB2G7+UJrags1q15Fudc7G6w==
@ -8036,6 +8076,13 @@ deepmerge@^4.2.2:
resolved "https://registry.yarnpkg.com/deepmerge/-/deepmerge-4.2.2.tgz#44d2ea3679b8f4d4ffba33f03d865fc1e7bf4955"
integrity sha512-FJ3UgI4gIl+PHZm53knsuSFpE+nESMr7M4v9QcgB7S63Kj/6WqMiFQJpBBYz1Pt+66bZpP3Q7Lye0Oo9MPKEdg==
default-require-extensions@^3.0.0:
version "3.0.0"
resolved "https://registry.yarnpkg.com/default-require-extensions/-/default-require-extensions-3.0.0.tgz#e03f93aac9b2b6443fc52e5e4a37b3ad9ad8df96"
integrity sha512-ek6DpXq/SCpvjhpFsLFRVtIxJCRw6fUR42lYMVZuUMK7n8eMz4Uh5clckdBjEpLhn/gEBZo7hDJnJcwdKLKQjg==
dependencies:
strip-bom "^4.0.0"
defaults@^1.0.3:
version "1.0.3"
resolved "https://registry.yarnpkg.com/defaults/-/defaults-1.0.3.tgz#c656051e9817d9ff08ed881477f3fe4019f3ef7d"
@ -8581,6 +8628,11 @@ es-to-primitive@^1.2.1:
is-date-object "^1.0.1"
is-symbol "^1.0.2"
es6-error@^4.0.1:
version "4.1.1"
resolved "https://registry.yarnpkg.com/es6-error/-/es6-error-4.1.1.tgz#9e3af407459deed47e9a91f9b885a84eb05c561d"
integrity sha512-Um/+FxMr9CISWh0bi5Zv0iOD+4cFh5qLeks1qhAopKVAJw3drgKbKySikp7wGhDL0HPeaja0P5ULZrxLkniUVg==
escalade@^3.1.1:
version "3.1.1"
resolved "https://registry.yarnpkg.com/escalade/-/escalade-3.1.1.tgz#d8cfdc7000965c5a0174b4a82eaa5c0552742e40"
@ -9270,6 +9322,15 @@ find-cache-dir@^2.0.0:
make-dir "^2.0.0"
pkg-dir "^3.0.0"
find-cache-dir@^3.2.0:
version "3.3.2"
resolved "https://registry.yarnpkg.com/find-cache-dir/-/find-cache-dir-3.3.2.tgz#b30c5b6eff0730731aea9bbd9dbecbd80256d64b"
integrity sha512-wXZV5emFEjrridIgED11OoUKLxiYjAcqot/NJdAkOhlJ+vGzwhOAfcG5OX1jP+S0PcjEn8bdMJv+g2jwQ3Onig==
dependencies:
commondir "^1.0.1"
make-dir "^3.0.2"
pkg-dir "^4.1.0"
find-up@5.0.0:
version "5.0.0"
resolved "https://registry.yarnpkg.com/find-up/-/find-up-5.0.0.tgz#4c92819ecb7083561e4f4a240a86be5198f536fc"
@ -9404,6 +9465,14 @@ for-own@^1.0.0:
dependencies:
for-in "^1.0.1"
foreground-child@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/foreground-child/-/foreground-child-2.0.0.tgz#71b32800c9f15aa8f2f83f4a6bd9bff35d861a53"
integrity sha512-dCIq9FpEcyQyXKCkyzmlPTFNgrCzPudOe+mhvJU5zAtlBnGVy2yKxtfsxK2tQBThwq225jcvBjpw1Gr40uzZCA==
dependencies:
cross-spawn "^7.0.0"
signal-exit "^3.0.2"
forever-agent@~0.6.1:
version "0.6.1"
resolved "https://registry.yarnpkg.com/forever-agent/-/forever-agent-0.6.1.tgz#fbc71f0c41adeb37f96c577ad1ed42d8fdacca91"
@ -9485,6 +9554,11 @@ fresh@0.5.2:
resolved "https://registry.yarnpkg.com/fresh/-/fresh-0.5.2.tgz#3d8cadd90d976569fa835ab1f8e4b23a105605a7"
integrity sha1-PYyt2Q2XZWn6g1qx+OSyOhBWBac=
fromentries@^1.2.0:
version "1.3.2"
resolved "https://registry.yarnpkg.com/fromentries/-/fromentries-1.3.2.tgz#e4bca6808816bf8f93b52750f1127f5a6fd86e3a"
integrity sha512-cHEpEQHUg0f8XdtZCc2ZAhrHzKzT0MrFUTcvx+hfxYu7rGMDc5SKoXFh+n4YigxsHXRzc6OrCshdR1bWH6HHyg==
fs-constants@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/fs-constants/-/fs-constants-1.0.0.tgz#6be0de9be998ce16af8afc24497b9ee9b7ccd9ad"
@ -10163,6 +10237,14 @@ hash-stream-validation@^0.2.2:
resolved "https://registry.yarnpkg.com/hash-stream-validation/-/hash-stream-validation-0.2.4.tgz#ee68b41bf822f7f44db1142ec28ba9ee7ccb7512"
integrity sha512-Gjzu0Xn7IagXVkSu9cSFuK1fqzwtLwFhNhVL8IFJijRNMgUttFbBSIAzKuSIrsFMO1+g1RlsoN49zPIbwPDMGQ==
hasha@^5.0.0:
version "5.2.2"
resolved "https://registry.yarnpkg.com/hasha/-/hasha-5.2.2.tgz#a48477989b3b327aea3c04f53096d816d97522a1"
integrity sha512-Hrp5vIK/xr5SkeN2onO32H0MgNZ0f17HRNH39WfL0SYUNOTZ5Lz1TJ8Pajo/87dYGEFlLMm7mIc/k/s6Bvz9HQ==
dependencies:
is-stream "^2.0.0"
type-fest "^0.8.0"
he@1.2.0:
version "1.2.0"
resolved "https://registry.yarnpkg.com/he/-/he-1.2.0.tgz#84ae65fa7eafb165fddb61566ae14baf05664f0f"
@ -10235,6 +10317,11 @@ hosted-git-info@^4.0.0, hosted-git-info@^4.0.1:
dependencies:
lru-cache "^6.0.0"
hpagent@^0.1.1:
version "0.1.2"
resolved "https://registry.yarnpkg.com/hpagent/-/hpagent-0.1.2.tgz#cab39c66d4df2d4377dbd212295d878deb9bdaa9"
integrity sha512-ePqFXHtSQWAFXYmj+JtOTHr84iNrII4/QRlAAPPE+zqnKy4xJo7Ie1Y4kC7AdB+LxLxSTTzBMASsEcy0q8YyvQ==
html-encoding-sniffer@^1.0.2:
version "1.0.2"
resolved "https://registry.yarnpkg.com/html-encoding-sniffer/-/html-encoding-sniffer-1.0.2.tgz#e70d84b94da53aa375e11fe3a351be6642ca46f8"
@ -11121,11 +11208,28 @@ istanbul-lib-coverage@^3.0.0:
resolved "https://registry.yarnpkg.com/istanbul-lib-coverage/-/istanbul-lib-coverage-3.0.0.tgz#f5944a37c70b550b02a78a5c3b2055b280cec8ec"
integrity sha512-UiUIqxMgRDET6eR+o5HbfRYP1l0hqkWOs7vNxC/mggutCMUIhWMm8gAHb8tHlyfD3/l6rlgNA5cKdDzEAf6hEg==
istanbul-lib-coverage@^3.2.0:
istanbul-lib-coverage@^3.0.0-alpha.1, istanbul-lib-coverage@^3.2.0:
version "3.2.0"
resolved "https://registry.yarnpkg.com/istanbul-lib-coverage/-/istanbul-lib-coverage-3.2.0.tgz#189e7909d0a39fa5a3dfad5b03f71947770191d3"
integrity sha512-eOeJ5BHCmHYvQK7xt9GkdHuzuCGS1Y6g9Gvnx3Ym33fz/HpLRYxiS0wHNr+m/MBC8B647Xt608vCDEvhl9c6Mw==
istanbul-lib-hook@^3.0.0:
version "3.0.0"
resolved "https://registry.yarnpkg.com/istanbul-lib-hook/-/istanbul-lib-hook-3.0.0.tgz#8f84c9434888cc6b1d0a9d7092a76d239ebf0cc6"
integrity sha512-Pt/uge1Q9s+5VAZ+pCo16TYMWPBIl+oaNIjgLQxcX0itS6ueeaA+pEfThZpH8WxhFgCiEb8sAJY6MdUKgiIWaQ==
dependencies:
append-transform "^2.0.0"
istanbul-lib-instrument@^4.0.0:
version "4.0.3"
resolved "https://registry.yarnpkg.com/istanbul-lib-instrument/-/istanbul-lib-instrument-4.0.3.tgz#873c6fff897450118222774696a3f28902d77c1d"
integrity sha512-BXgQl9kf4WTCPCCpmFGoJkz/+uhvm7h7PFKUYxh7qarQd3ER33vHG//qaE8eN25l07YqZPpHXU9I09l/RD5aGQ==
dependencies:
"@babel/core" "^7.7.5"
"@istanbuljs/schema" "^0.1.2"
istanbul-lib-coverage "^3.0.0"
semver "^6.3.0"
istanbul-lib-instrument@^5.0.4, istanbul-lib-instrument@^5.1.0:
version "5.1.0"
resolved "https://registry.yarnpkg.com/istanbul-lib-instrument/-/istanbul-lib-instrument-5.1.0.tgz#7b49198b657b27a730b8e9cb601f1e1bff24c59a"
@ -11137,6 +11241,19 @@ istanbul-lib-instrument@^5.0.4, istanbul-lib-instrument@^5.1.0:
istanbul-lib-coverage "^3.2.0"
semver "^6.3.0"
istanbul-lib-processinfo@^2.0.2:
version "2.0.2"
resolved "https://registry.yarnpkg.com/istanbul-lib-processinfo/-/istanbul-lib-processinfo-2.0.2.tgz#e1426514662244b2f25df728e8fd1ba35fe53b9c"
integrity sha512-kOwpa7z9hme+IBPZMzQ5vdQj8srYgAtaRqeI48NGmAQ+/5yKiHLV0QbYqQpxsdEF0+w14SoB8YbnHKcXE2KnYw==
dependencies:
archy "^1.0.0"
cross-spawn "^7.0.0"
istanbul-lib-coverage "^3.0.0-alpha.1"
make-dir "^3.0.0"
p-map "^3.0.0"
rimraf "^3.0.0"
uuid "^3.3.3"
istanbul-lib-report@^3.0.0:
version "3.0.0"
resolved "https://registry.yarnpkg.com/istanbul-lib-report/-/istanbul-lib-report-3.0.0.tgz#7518fe52ea44de372f460a76b5ecda9ffb73d8a6"
@ -11155,7 +11272,7 @@ istanbul-lib-source-maps@^4.0.0:
istanbul-lib-coverage "^3.0.0"
source-map "^0.6.1"
istanbul-reports@^3.1.3:
istanbul-reports@^3.0.2, istanbul-reports@^3.1.3:
version "3.1.4"
resolved "https://registry.yarnpkg.com/istanbul-reports/-/istanbul-reports-3.1.4.tgz#1b6f068ecbc6c331040aab5741991273e609e40c"
integrity sha512-r1/DshN4KSE7xWEknZLLLLDn5CJybV3nw01VTkp6D5jzLuELlcbudfj/eSQFvrKsJuTVCGnePO7ho82Nw9zzfw==
@ -12309,6 +12426,11 @@ lodash.escaperegexp@^4.1.2:
resolved "https://registry.yarnpkg.com/lodash.escaperegexp/-/lodash.escaperegexp-4.1.2.tgz#64762c48618082518ac3df4ccf5d5886dae20347"
integrity sha1-ZHYsSGGAglGKw99Mz11YhtriA0c=
lodash.flattendeep@^4.4.0:
version "4.4.0"
resolved "https://registry.yarnpkg.com/lodash.flattendeep/-/lodash.flattendeep-4.4.0.tgz#fb030917f86a3134e5bc9bec0d69e0013ddfedb2"
integrity sha1-+wMJF/hqMTTlvJvsDWngAT3f7bI=
lodash.get@^4, lodash.get@^4.4.2:
version "4.4.2"
resolved "https://registry.yarnpkg.com/lodash.get/-/lodash.get-4.4.2.tgz#2d177f652fa31e939b4438d5341499dfa3825e99"
@ -12597,7 +12719,7 @@ make-dir@^2.0.0, make-dir@^2.1.0:
pify "^4.0.1"
semver "^5.6.0"
make-dir@^3.0.0:
make-dir@^3.0.0, make-dir@^3.0.2:
version "3.1.0"
resolved "https://registry.yarnpkg.com/make-dir/-/make-dir-3.1.0.tgz#415e967046b3a7f1d185277d84aa58203726a13f"
integrity sha512-g3FeP20LNwhALb/6Cz6Dd4F2ngze0jz7tbzrD2wAV+o9FeNHe4rL+yK2md0J/fiSf1sa1ADhXqi5+oVwOM/eGw==
@ -13102,7 +13224,7 @@ ms@2.1.2, ms@^2.1.1:
resolved "https://registry.yarnpkg.com/ms/-/ms-2.1.2.tgz#d09d1f357b443f493382a8eb3ccd183872ae6009"
integrity sha512-sGkPx+VjMtmA6MX27oA4FBFELFCZZ4S4XqeGOXCv68tT+jb3vk/RyaKWP0PTKyWtmLSM0b+adUTEvbs1PEaH2w==
ms@2.1.3, ms@^2.0.0:
ms@2.1.3, ms@^2.0.0, ms@^2.1.3:
version "2.1.3"
resolved "https://registry.yarnpkg.com/ms/-/ms-2.1.3.tgz#574c8138ce1d2b5861f0b44579dbadd60c6615b2"
integrity sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==
@ -13357,6 +13479,13 @@ node-plop@~0.26.2:
mkdirp "^0.5.1"
resolve "^1.12.0"
node-preload@^0.2.1:
version "0.2.1"
resolved "https://registry.yarnpkg.com/node-preload/-/node-preload-0.2.1.tgz#c03043bb327f417a18fee7ab7ee57b408a144301"
integrity sha512-RM5oyBy45cLEoHqCeh+MNuFAxO0vTFBLskvQbOKnEE7YTTSN4tbN8QWDIPQ6L+WvKsB/qLEGpYe2ZZ9d4W9OIQ==
dependencies:
process-on-spawn "^1.0.0"
node-releases@^1.1.75:
version "1.1.75"
resolved "https://registry.yarnpkg.com/node-releases/-/node-releases-1.1.75.tgz#6dd8c876b9897a1b8e5a02de26afa79bb54ebbfe"
@ -13567,6 +13696,39 @@ nwsapi@^2.0.9, nwsapi@^2.2.0:
resolved "https://registry.yarnpkg.com/nwsapi/-/nwsapi-2.2.0.tgz#204879a9e3d068ff2a55139c2c772780681a38b7"
integrity sha512-h2AatdwYH+JHiZpv7pt/gSX1XoRGb7L/qSIeuqA6GwYoF9w1vP1cw42TO0aI2pNyshRK5893hNSl+1//vHK7hQ==
nyc@^15.1.0:
version "15.1.0"
resolved "https://registry.yarnpkg.com/nyc/-/nyc-15.1.0.tgz#1335dae12ddc87b6e249d5a1994ca4bdaea75f02"
integrity sha512-jMW04n9SxKdKi1ZMGhvUTHBN0EICCRkHemEoE5jm6mTYcqcdas0ATzgUgejlQUHMvpnOZqGB5Xxsv9KxJW1j8A==
dependencies:
"@istanbuljs/load-nyc-config" "^1.0.0"
"@istanbuljs/schema" "^0.1.2"
caching-transform "^4.0.0"
convert-source-map "^1.7.0"
decamelize "^1.2.0"
find-cache-dir "^3.2.0"
find-up "^4.1.0"
foreground-child "^2.0.0"
get-package-type "^0.1.0"
glob "^7.1.6"
istanbul-lib-coverage "^3.0.0"
istanbul-lib-hook "^3.0.0"
istanbul-lib-instrument "^4.0.0"
istanbul-lib-processinfo "^2.0.2"
istanbul-lib-report "^3.0.0"
istanbul-lib-source-maps "^4.0.0"
istanbul-reports "^3.0.2"
make-dir "^3.0.0"
node-preload "^0.2.1"
p-map "^3.0.0"
process-on-spawn "^1.0.0"
resolve-from "^5.0.0"
rimraf "^3.0.0"
signal-exit "^3.0.2"
spawn-wrap "^2.0.0"
test-exclude "^6.0.0"
yargs "^15.0.2"
oauth-sign@~0.9.0:
version "0.9.0"
resolved "https://registry.yarnpkg.com/oauth-sign/-/oauth-sign-0.9.0.tgz#47a7b016baa68b5fa0ecf3dee08a85c679ac6455"
@ -13953,6 +14115,16 @@ p-waterfall@^2.1.1:
dependencies:
p-reduce "^2.0.0"
package-hash@^4.0.0:
  version "4.0.0"
  resolved "https://registry.yarnpkg.com/package-hash/-/package-hash-4.0.0.tgz#3537f654665ec3cc38827387fc904c163c54f506"
  integrity sha512-whdkPIooSu/bASggZ96BWVvZTRMOFxnyUG5PnTSGKoJE2gd5mbVNmR2Nj20QFzxYYgAXpoqC+AiXzl+UMRh7zQ==
  dependencies:
    graceful-fs "^4.1.15"
    hasha "^5.0.0"
    lodash.flattendeep "^4.4.0"
    release-zalgo "^1.0.0"
package-json@^6.3.0:
  version "6.5.0"
  resolved "https://registry.yarnpkg.com/package-json/-/package-json-6.5.0.tgz#6feedaca35e75725876d0b0e64974697fed145b0"
@@ -14352,7 +14524,7 @@ pirates@^4.0.4:
  resolved "https://registry.yarnpkg.com/pirates/-/pirates-4.0.5.tgz#feec352ea5c3268fb23a37c702ab1699f35a5f3b"
  integrity sha512-8V9+HQPupnaXMA23c5hvl69zXvTwTzyAYasnkb0Tts4XvO4CliqONMOnvlq26rkhLC3nWDFBJf73LU1e1VZLaQ==
-pkg-dir@4.2.0, pkg-dir@^4.2.0:
+pkg-dir@4.2.0, pkg-dir@^4.1.0, pkg-dir@^4.2.0:
  version "4.2.0"
  resolved "https://registry.yarnpkg.com/pkg-dir/-/pkg-dir-4.2.0.tgz#f099133df7ede422e81d1d8448270eeb3e4261f3"
  integrity sha512-HRDzbaKjC+AOWVXxAU/x54COGeIv9eb+6CkDSQoNTt4XyWoIJvuPsXizxu/Fr23EiekbtZwmh1IcIG/l/a10GQ==
@@ -14495,6 +14667,13 @@ process-nextick-args@~2.0.0:
  resolved "https://registry.yarnpkg.com/process-nextick-args/-/process-nextick-args-2.0.1.tgz#7820d9b16120cc55ca9ae7792680ae7dba6d7fe2"
  integrity sha512-3ouUOpQhtgrbOa17J7+uxOTpITYWaGP7/AhoR3+A+/1e9skrzelGi/dXzEYyvbxubEF6Wn2ypscTKiKJFFn1ag==
process-on-spawn@^1.0.0:
  version "1.0.0"
  resolved "https://registry.yarnpkg.com/process-on-spawn/-/process-on-spawn-1.0.0.tgz#95b05a23073d30a17acfdc92a440efd2baefdc93"
  integrity sha512-1WsPDsUSMmZH5LeMLegqkPDrsGgsWwk1Exipy2hvB0o/F0ASzbpIctSCcZIK1ykJvtTJULEH+20WOFjMvGnCTg==
  dependencies:
    fromentries "^1.2.0"
process@^0.10.0:
  version "0.10.1"
  resolved "https://registry.yarnpkg.com/process/-/process-0.10.1.tgz#842457cc51cfed72dc775afeeafb8c6034372725"
@@ -15127,6 +15306,13 @@ relay-runtime@12.0.0:
    fbjs "^3.0.0"
    invariant "^2.2.4"
release-zalgo@^1.0.0:
  version "1.0.0"
  resolved "https://registry.yarnpkg.com/release-zalgo/-/release-zalgo-1.0.0.tgz#09700b7e5074329739330e535c5a90fb67851730"
  integrity sha1-CXALflB0Mpc5Mw5TXFqQ+2eFFzA=
  dependencies:
    es6-error "^4.0.1"
remedial@^1.0.7:
  version "1.0.8"
  resolved "https://registry.yarnpkg.com/remedial/-/remedial-1.0.8.tgz#a5e4fd52a0e4956adbaf62da63a5a46a78c578a0"
@@ -15440,6 +15626,11 @@ search-query-parser@^1.6.0:
  resolved "https://registry.yarnpkg.com/search-query-parser/-/search-query-parser-1.6.0.tgz#d69ade33f3685cae25613a70189b7b18970b46f1"
  integrity sha512-bhf+phLlKF38nuniwLcVHWPArHGdzenlPhPi955CR3vm1QQifXIuPHwAffhjapojdVVzmv4hgIJ6NOX1d/w+Uw==
secure-json-parse@^2.3.1:
  version "2.4.0"
  resolved "https://registry.yarnpkg.com/secure-json-parse/-/secure-json-parse-2.4.0.tgz#5aaeaaef85c7a417f76271a4f5b0cc3315ddca85"
  integrity sha512-Q5Z/97nbON5t/L/sH6mY2EacfjVGwrCcSi5D3btRO2GZ8pf1K1UN7Z9H5J57hjVU2Qzxr1xO+FmBhOvEkzCMmg==
semver-diff@^3.1.1:
  version "3.1.1"
  resolved "https://registry.yarnpkg.com/semver-diff/-/semver-diff-3.1.1.tgz#05f77ce59f325e00e2706afd67bb506ddb1ca32b"
@@ -15807,6 +15998,18 @@ spawn-command@^0.0.2-1:
  resolved "https://registry.yarnpkg.com/spawn-command/-/spawn-command-0.0.2-1.tgz#62f5e9466981c1b796dc5929937e11c9c6921bd0"
  integrity sha1-YvXpRmmBwbeW3Fkpk34RycaSG9A=
spawn-wrap@^2.0.0:
  version "2.0.0"
  resolved "https://registry.yarnpkg.com/spawn-wrap/-/spawn-wrap-2.0.0.tgz#103685b8b8f9b79771318827aa78650a610d457e"
  integrity sha512-EeajNjfN9zMnULLwhZZQU3GWBoFNkbngTUPfaawT4RkMiviTxcX0qfhVbGey39mfctfDHkWtuecgQ8NJcyQWHg==
  dependencies:
    foreground-child "^2.0.0"
    is-windows "^1.0.2"
    make-dir "^3.0.0"
    rimraf "^3.0.0"
    signal-exit "^3.0.2"
    which "^2.0.1"
spdx-correct@^3.0.0:
  version "3.1.1"
  resolved "https://registry.yarnpkg.com/spdx-correct/-/spdx-correct-3.1.1.tgz#dece81ac9c1e6713e5f7d1b6f17d468fa53d89a9"
@@ -16769,7 +16972,7 @@ type-fest@^0.6.0:
  resolved "https://registry.yarnpkg.com/type-fest/-/type-fest-0.6.0.tgz#8d2a2370d3df886eb5c90ada1c5bf6188acf838b"
  integrity sha512-q+MB8nYR1KDLrgr4G5yemftpMC7/QLqVndBmEEdqzmNj5dcFOO4Oo8qlwZE3ULT3+Zim1F8Kq4cBnikNhlCMlg==
-type-fest@^0.8.1:
+type-fest@^0.8.0, type-fest@^0.8.1:
  version "0.8.1"
  resolved "https://registry.yarnpkg.com/type-fest/-/type-fest-0.8.1.tgz#09e249ebde851d3b1e48d27c105444667f17b83d"
  integrity sha512-4dbzIzqvjtgiM5rw1k5rEHtBANKmdudhGyBEajN01fEyhaAIhsoKNy6y7+IN93IfpFtwY9iqi7kD+xwKhQsNJA==
@@ -17112,7 +17315,7 @@ utils-merge@1.0.1:
  resolved "https://registry.yarnpkg.com/utils-merge/-/utils-merge-1.0.1.tgz#9f95710f50a267947b2ccc124741c1028427e713"
  integrity sha1-n5VxD1CiZ5R7LMwSR0HBAoQn5xM=
-uuid@^3.2.1, uuid@^3.3.2:
+uuid@^3.2.1, uuid@^3.3.2, uuid@^3.3.3:
  version "3.4.0"
  resolved "https://registry.yarnpkg.com/uuid/-/uuid-3.4.0.tgz#b23e4358afa8a202fe7a100af1f5f883f02007ee"
  integrity sha512-HjSDRw6gZE5JMggctHBcjVak08+KEVhSIiDzFnT9S9aegmp85S/bReBVTb4QTFaRNptJ9kuYaNhnbNEOkbKb/A==
@@ -17671,7 +17874,7 @@ yargs@16.2.0, yargs@^16.0.0, yargs@^16.1.1, yargs@^16.2.0:
    y18n "^5.0.5"
    yargs-parser "^20.2.2"
-yargs@^15.3.1:
+yargs@^15.0.2, yargs@^15.3.1:
  version "15.4.1"
  resolved "https://registry.yarnpkg.com/yargs/-/yargs-15.4.1.tgz#0d87a16de01aee9d8bec2bfbf74f67851730f4f8"
  integrity sha512-aePbxDmcYW++PaqBsJ+HYUFwCdv4LVvdnhBy78E57PIor8/OVvhMrADFFEDh8DHDFRv/O9i3lPhsENjO7QX0+A==