Integrate Algolia Search (#144)
New Features
- Automated data push to Algolia search engine via GitHub Actions.
- Added new Hugo theme submodule for Algolia integration.
- Enhanced security and access control in device provisioning service.
- Updated disaster recovery documentation with improved backup approaches and DNS/IP handling.
- Added HTML structure to the main website layout for better content rendering.
- Introduced a script for managing data push to Algolia.
Bug Fixes
- Corrected a typo in the dashboard branding tutorial.
- Fixed a formatting issue in the device provisioning service documentation.
Chores
- Updated .gitignore and .gitmodules configurations.
- Updated subproject commit references for theme integration.

---------

Co-authored-by: Patrik Matiaško <[email protected]>
Co-authored-by: Jozef Kralik <[email protected]>
3 people authored Mar 13, 2024
1 parent ae4d0c9 commit bbae8a4
Showing 14 changed files with 1,977 additions and 40 deletions.
51 changes: 51 additions & 0 deletions .github/workflows/algolia.yml
@@ -0,0 +1,51 @@
name: Algolia Push

# tmp: just for developing
on:
  push:

jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4
        with:
          # fetch branches and history so `git merge-base` in check-format-on-diff works correctly
          fetch-depth: 0
      - uses: actions/setup-go@v5
        with:
          go-version: "^1.20" # The Go version to download (if necessary) and use.
          check-latest: true

      - name: Install npm dependencies
        run: npm install -g --legacy-peer-deps hugo-algolia

      - name: Check if hugo-algolia is installed
        run: hugo-algolia --version

      - name: Generate algolia.json
        run: |
          mkdir -p public
          hugo-algolia --config config/_default/config.yaml

      - name: Build Go binary
        run: |
          cd tools/adjust-algolia-output
          go build -o /usr/local/bin/adjust-algolia-output

      - name: Adjust algolia.json using binary
        run: adjust-algolia-output < ./public/algolia.json | jq > public/algolia_final.json

      - name: Install Node.js
        if: github.event_name == 'push' && github.ref == 'refs/heads/main'
        uses: actions/setup-node@v2
        with:
          node-version: '14'

      - name: Push to Algolia
        if: github.event_name == 'push' && github.ref == 'refs/heads/main'
        run: |
          cd scripts
          npm install algoliasearch
          APPLICATION_ID=${{ secrets.ALGOLIA_APPLICATION_ID }} API_KEY=${{ secrets.ALGOLIA_API_KEY }} node push-to-algolia.js ../public/algolia_final.json
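The `Adjust algolia.json using binary` step pipes the hugo-algolia output through a small Go filter built from `tools/adjust-algolia-output`. That tool's source is not shown in this diff, so the following is only a hedged sketch of the kind of stdin-to-stdout filter the step implies: accept either a bare record array or an object wrapping one (hugo-algolia's output shape varies by version), drop empty records, and emit a flat JSON array ready for `saveObjects`. The field names `content` and `results` are assumptions, not values taken from the real tool.

```go
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"os"
)

// adjust normalizes hugo-algolia output into a flat JSON array of records.
// NOTE: this is an illustrative sketch; the real implementation lives in
// tools/adjust-algolia-output in the repository and may differ.
func adjust(in []byte) ([]byte, error) {
	var records []map[string]any

	// hugo-algolia may emit a bare array of records...
	if err := json.Unmarshal(in, &records); err != nil {
		// ...or an object wrapping them under a key such as "results" (assumed name).
		var wrapped map[string]json.RawMessage
		if err2 := json.Unmarshal(in, &wrapped); err2 != nil {
			return nil, err
		}
		if raw, ok := wrapped["results"]; ok {
			if err2 := json.Unmarshal(raw, &records); err2 != nil {
				return nil, err2
			}
		}
	}

	// Drop records without content to keep the search index lean.
	out := records[:0]
	for _, r := range records {
		if c, ok := r["content"].(string); ok && c != "" {
			out = append(out, r)
		}
	}
	return json.Marshal(out)
}

func main() {
	in, err := io.ReadAll(os.Stdin)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	out, err := adjust(in)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	os.Stdout.Write(out)
}
```

Built and used exactly as in the workflow, this reads `public/algolia.json` on stdin and writes the adjusted array, which `jq` then pretty-prints into `public/algolia_final.json`.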
3 changes: 2 additions & 1 deletion .gitignore
@@ -14,4 +14,5 @@ nohup.out
trace.out
.idea
node_modules
.hugo_build.lock
adjust-algolia-output
18 changes: 15 additions & 3 deletions config/_default/config.yaml
@@ -3,6 +3,7 @@ baseURL: 'https://docs.plgd.dev/'
theme: 'plgd'
title: 'plgd docs'
defaultContentLanguage: en
defaultContentLanguageInSubdir: false
enableEmoji: true
footnotereturnlinkcontents:
languageCode: en-us
@@ -17,16 +18,14 @@ languages:
  # weight: 2
copyright: 'All Rights Reserved © 2020-{year} plgd.dev, s.r.o.'
googleAnalytics: 'GTM-5B8C4RK'
algolia:
  index: "doc"
  appID: "42D6VHXINQ"
pluralizeListTitles: false
outputs:
  home:
    - HTML
    - RSS
    - REDIR
    - HEADERS
    - ALGOLIA
  section:
    - HTML
    - RSS
@@ -44,6 +43,15 @@ outputFormats:
    baseName: _headers
    isPlainText: true
    notAlternative: true
  Algolia:
    baseName: "algolia"
    isPlainText: true
    mediaType: "application/json"
    notAlternative: true
  JSON:
    mediaType: "application/json"
    baseName: "data"
    isPlainText: true
caches:
  getjson:
    dir: ':cacheDir/:project'
@@ -75,4 +83,8 @@ taxonomies:
  category: categories
params:
  githubRepository: 'https://github.com/plgd-dev/doc/'
  algolia:
    indexName: 'doc'
    apiKey: '31dbe24685b8a1a7025c12098b32df37'
    appId: '42D6VHXINQ'
54 changes: 27 additions & 27 deletions content/en/docs/deployment/device-provisioning-service/advanced.md
@@ -47,43 +47,43 @@ In the process of acquiring a device access token from the OAuth server, the Dev
1. Create an OAuth client for DPS in KeyCloak with the following configuration:

   - Settings:
     - Enabled: On
     - Client Protocol: openid-connect
     - Access Type: confidential
     - Service Accounts Enabled: On
     - Authorization Enabled: On

   - Credentials:
     - Client Authenticator: Client Id and Secret
     - Secret: `<MY_DPS_CLIENT_SECRET>`

   - Mapper:
     - Create a custom `Hardcoded claim` mapper:
       - Token Claim Name: `<OWNER_CLAIM>`
       - Claim value: `<OWNER>`
       - Claim JSON Type: String
       - Add to access token: On
       - Add to userinfo: On

2. Create a WWW OAuth client with a mapper that adds the `<OWNER_CLAIM>` claim to the user JWT token. You can use the `User Property` mapper with the following configuration to map the `id` property to the `<OWNER_CLAIM>` claim:

   - Settings:
     - Enabled: On
     - Client Protocol: openid-connect
     - Access Type: public
     - Standard Flow Enabled: On
     - Valid Redirect URIs: `[ https://www.example.com/*,... ]`
     - Backchannel Logout Session Required: On
     - OpenID Connect Compatibility Modes:
       - Use Refresh Tokens: On

   - Mapper:
     - Property: id
     - Token Claim Name: `<OWNER_CLAIM>`
     - Claim JSON Type: String
     - Add to ID token: On
     - Add to access token: On
     - Add to userinfo: On

In the helm chart, add the following configuration:

2 changes: 1 addition & 1 deletion content/en/docs/deployment/hub/advanced.md
@@ -181,7 +181,7 @@ certmanager:
  cert:
    duration: 876000h # 100 years for intermediate CA used to sign device certificates
  ca: # CA used by default to sign service and device certificates
    issuerRef:
      kind: "ClusterIssuer" # or "Issuer"
      name: "plgd-ca-issuer"
      group: cert-manager.io
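The `issuerRef` above points at a cert-manager issuer that must exist in the cluster. As a hedged illustration (the secret name is a placeholder, not a value from this commit), a matching CA-backed `ClusterIssuer` could look like:

```yaml
# Hypothetical ClusterIssuer matching the issuerRef above.
# `secretName` must reference a Secret holding the CA key pair (tls.crt/tls.key).
apiVersion: cert-manager.io/v1
kind: ClusterIssuer
metadata:
  name: plgd-ca-issuer
spec:
  ca:
    secretName: plgd-ca-keypair
```

With `kind: "Issuer"` instead, the issuer would be namespaced rather than cluster-wide.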
@@ -61,7 +61,7 @@ The plgd hub is a stateful event-driven system, meaning that data is stored in t
To back up the database, two approaches can be used:

* **Passive Backup**

![passive-backup](/docs/features/monitoring-and-diagnostics/static/disaster-recovery-passive-backup.drawio.svg)

The database is regularly backed up to a different location and can be used in case of failure. Although this approach is simple and requires fewer resources, the data may become outdated, and the restoration process may take some time. For MongoDB, utilize the `mongodump` tool to create an export of the database contents, store it securely, and use it in case of failure. Regular backups are essential to keep the data up-to-date. For more details on this approach, refer to the [MongoDB documentation](https://www.mongodb.com/docs/database-tools/mongodump/).
@@ -93,25 +93,24 @@ The CoAP-Gateway and Device Provisioning Service depend on certificates validate
If a primary cluster failure occurs and you cannot dynamically modify the endpoint on the devices, they will be unable to establish a connection with the hub. Devices are set up with a single endpoint to link with either the CoAP-Gateway or the Device Provisioning Service, which may include an IP address or DNS address. To guarantee connectivity to the secondary cluster, adopt one of the provided options:

* **DNS Address as endpoint**

In case of primary cluster failure, update the DNS record on the DNS server. It is recommended to set the time to live (TTL) of the DNS record to a low value, e.g., 30 minutes.

* **IP Address as endpoint**

![load-balancer](/docs/features/monitoring-and-diagnostics/static/disaster-recovery-load-balancer.drawio.svg)

Changing the IP address could be challenging in case of primary cluster failure, as the public IP address is often assigned to the Internet Service Provider (ISP). However, using an IP load balancer near devices allows changing the IP address of the load balancer to the secondary cluster. For this, you can use HAProxy, which supports layer 4 load balancing. For more information, refer to the [HAProxy documentation](https://www.haproxy.com/documentation/haproxy-configuration-tutorials/load-balancing/tcp/) and [Failover & Worst Case Management With HAProxy](https://www.haproxy.com/blog/failover-and-worst-case-management-with-haproxy).

* **Update Device Provisioning Service endpoint**

Under these circumstances, you have the option to update the DPS endpoint to the secondary cluster by utilizing the DHCP server to supply the devices with the updated endpoint. The device retrieves a new configuration from the DPS service, obtaining updated:

  * Time (optional)
  * Owner
  * Credentials - Identity certificate, root CA certificate, and Pre-shared key (optional)
  * Access control lists (ACLs)
  * Cloud configuration - Authorization code, Hub ID, Hub URL, etc.

  Subsequently, the device connects to the cloud, with the first operation being to sign up for self-registration.
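For the **IP Address as endpoint** option above, the layer-4 failover in HAProxy might be sketched as follows. All addresses, ports, and names here are placeholders, not values from this deployment; consult the linked HAProxy documentation for the authoritative syntax:

```
# Sketch: TCP (layer-4) failover in front of two plgd clusters.
# `backup` keeps the secondary idle until the primary's health check fails.
frontend coap_in
    mode tcp
    bind :5684
    default_backend hub_coap

backend hub_coap
    mode tcp
    server primary   203.0.113.10:5684  check
    server secondary 198.51.100.10:5684 check backup
```

Devices keep pointing at the load balancer's stable IP, so no device-side reconfiguration is needed when traffic shifts to the secondary cluster.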

**From the Hub perspective:**
2 changes: 1 addition & 1 deletion content/en/docs/tutorials/dashboard-branding.md
@@ -62,7 +62,7 @@ Footer has a dedicated component which can be found in [`footer.js`](https://git

## Text changes

Every text in this application is coming from a translation file located in [`languages.json`](https://github.com/plgd-dev/hub/blob/main/http-gateway/web/src/languages/languages.json). This object contains a language block for each language you support in your application. If a block is missing you can duplicate an existing block and modify the block with the language code that is missing.

Some messages might be missing. This is due to fact that they were not yet translated. You can add them manually or use a language editor like [POEditor](https://poeditor.com/).
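Structurally, the file is keyed by language code, with one object of message IDs per language. A hypothetical two-language excerpt (the message IDs and translations here are illustrative, not taken from the real file):

```json
{
  "en": {
    "devices.title": "Devices"
  },
  "sk": {
    "devices.title": "Zariadenia"
  }
}
```

Duplicating the `en` block under a new language code and translating its values is the pattern described above for adding a missing language.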

21 changes: 20 additions & 1 deletion layouts/index.html
@@ -1 +1,20 @@
{{ with .GetPage "docs/" }}{{ .Render }}{{ end }}
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>{{ .Title }}</title>
</head>
<body>
  <!-- <header>-->
  <!--   <h1>{{ .Title }}</h1>-->
  <!-- </header>-->
  <main>
    <!-- Your home page content goes here -->
    {{ with .GetPage "docs/" }}{{ .Render }}{{ end }}
  </main>
  <footer>
    <!-- Footer content goes here -->
  </footer>
</body>
</html>
32 changes: 32 additions & 0 deletions scripts/push-to-algolia.js
@@ -0,0 +1,32 @@
const algoliasearch = require('algoliasearch');
const fs = require('fs');

// Initialize Algolia client
const client = algoliasearch(process.env.APPLICATION_ID, process.env.API_KEY);
const index = client.initIndex('doc');

// Log CLI arguments; `i` avoids shadowing the `index` client above
process.argv.forEach((val, i) => {
  console.log(`args ${i}: ${val}`);
});

// Read JSON file (path to the records file is the first CLI argument)
const jsonData = fs.readFileSync(process.argv[2], 'utf8');

// Parse JSON data
const records = JSON.parse(jsonData);

// Clear the existing index
index.clearObjects()
  .then(() => {
    console.log('Existing records cleared');
    // Add or update records in Algolia index
    return index.saveObjects(records, { autoGenerateObjectIDIfNotExist: true });
  })
  .then(({ objectIDs }) => {
    console.log('Records added/updated:', objectIDs);
  })
  .catch(error => {
    console.error('Error adding/updating records:', error);
  });

2 changes: 1 addition & 1 deletion themes/plgd
Submodule plgd updated from 6ea8a1 to eb3f6a
