GREEN-40: add debug endpoints behind debug middleware #15

Merged · 1 commit · May 2, 2024
117 changes: 71 additions & 46 deletions src/profiler/memoryReporting.ts
@@ -6,6 +6,7 @@
 import { getActiveNodeCount } from '../NodeList'
 import { spawn } from 'child_process'
 import * as process from 'process'
+import { isDebugMiddleware } from '../DebugMode'

 type CounterMap = Map<string, CounterNode>
 interface CounterNode {
@@ -38,21 +39,29 @@
   }

   registerEndpoints(): void {
-    this.server.get('/memory', (req, res) => {
-      const toMB = 1 / 1000000
-      const report = process.memoryUsage()
-      let outputStr = ''
-      outputStr += `System Memory Report. Timestamp: ${Date.now()}\n`
-      outputStr += `rss: ${(report.rss * toMB).toFixed(2)} MB\n`
-      outputStr += `heapTotal: ${(report.heapTotal * toMB).toFixed(2)} MB\n`
-      outputStr += `heapUsed: ${(report.heapUsed * toMB).toFixed(2)} MB\n`
-      outputStr += `external: ${(report.external * toMB).toFixed(2)} MB\n`
-      outputStr += `arrayBuffers: ${(report.arrayBuffers * toMB).toFixed(2)} MB\n\n\n`
-
-      this.gatherReport()
-      outputStr = this.reportToStream(this.report, outputStr)
-      res.send(outputStr)
-    })
+    this.server.get(
+      '/memory',
+      {
+        preHandler: async (_request, reply) => {
+          isDebugMiddleware(_request, reply)
+        },
+      },
+      (req, res) => {
+        const toMB = 1 / 1000000
+        const report = process.memoryUsage()
+        let outputStr = ''
+        outputStr += `System Memory Report. Timestamp: ${Date.now()}\n`
+        outputStr += `rss: ${(report.rss * toMB).toFixed(2)} MB\n`
+        outputStr += `heapTotal: ${(report.heapTotal * toMB).toFixed(2)} MB\n`
+        outputStr += `heapUsed: ${(report.heapUsed * toMB).toFixed(2)} MB\n`
+        outputStr += `external: ${(report.external * toMB).toFixed(2)} MB\n`
+        outputStr += `arrayBuffers: ${(report.arrayBuffers * toMB).toFixed(2)} MB\n\n\n`
+
+        this.gatherReport()
+        outputStr = this.reportToStream(this.report, outputStr)
+        res.send(outputStr)
+      }
+    )

     // this.server.get('memory-gc', (req, res) => {
     //   res.write(`System Memory Report. Timestamp: ${Date.now()}\n`)
@@ -69,37 +78,53 @@
     //   res.end()
     // })

-    this.server.get('/top', (req, res) => {
-      const top = spawn('top', ['-n', '10'])
-      top.stdout.on('data', (dataBuffer) => {
-        res.send(dataBuffer.toString())
-        top.kill()
-      })
-      top.on('close', (code) => {
-        console.log(`child process exited with code ${code}`)
-      })
-      top.stderr.on('data', (data) => {
-        console.log('top command error', data)
-        res.send('top command error')
-        top.kill()
-      })
-    })
-
-    this.server.get('/df', (req, res) => {
-      const df = spawn('df')
-      df.stdout.on('data', (dataBuffer) => {
-        res.send(dataBuffer.toString())
-        df.kill()
-      })
-      df.on('close', (code) => {
-        console.log(`child process exited with code ${code}`)
-      })
-      df.stderr.on('data', (data) => {
-        console.log('df command error', data)
-        res.send('df command error')
-        df.kill()
-      })
-    })
+    this.server.get(
+      '/top',
+      {
+        preHandler: async (_request, reply) => {
+          isDebugMiddleware(_request, reply)
+        },
+      },
+      (req, res) => {
+        const top = spawn('top', ['-n', '10'])
+        top.stdout.on('data', (dataBuffer) => {
+          res.send(dataBuffer.toString())
+          top.kill()
+        })
+        top.on('close', (code) => {
+          console.log(`child process exited with code ${code}`)
+        })
+        top.stderr.on('data', (data) => {
+          console.log('top command error', data)
+          res.send('top command error')
+          top.kill()
+        })
+      }
Comment on lines +88 to +102

Check failure: Code scanning / CodeQL: Missing rate limiting (High)

This route handler performs a system command, but is not rate-limited.

Copilot Autofix (AI), 6 months ago

The best way to fix the problem is to add rate limiting to the route handlers that perform system commands, using a rate-limiting middleware such as fastify-rate-limit. The fastify-rate-limit package provides rate-limiting middleware for Fastify applications: it caps the rate at which requests are accepted, which helps prevent denial-of-service attacks.

To fix the problem, install the fastify-rate-limit package, import it at the top of the file, and register it on the Fastify server with the register method. The rate limiter can be configured with options such as max (the maximum number of requests allowed in the time window), timeWindow (the duration of the window), and allowList (an array of IP addresses exempt from rate limiting).

In this case, max can be set to a value that handles the expected load while blocking excessive requests, timeWindow can be set to '1 minute', and allowList can be left empty so that rate limiting applies to all clients.
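To make the max/timeWindow semantics concrete, here is a minimal fixed-window counter sketched in plain TypeScript. This is an illustration of the windowing idea only, not fastify-rate-limit's actual implementation, and the FixedWindowLimiter name is invented for this example:

```typescript
// Illustrative fixed-window rate limiter mirroring the max/timeWindow
// semantics described above. Not the plugin's real implementation.
class FixedWindowLimiter {
  private count = 0
  private windowStart = 0
  private readonly max: number
  private readonly windowMs: number

  constructor(max: number, windowMs: number) {
    this.max = max
    this.windowMs = windowMs
  }

  // Returns true if the request is allowed, false if it should get a 429.
  allow(now: number): boolean {
    if (now - this.windowStart >= this.windowMs) {
      // The window has elapsed: start a fresh one.
      this.windowStart = now
      this.count = 0
    }
    this.count += 1
    return this.count <= this.max
  }
}

// Example configuration matching the suggested fix: 100 requests per minute.
const limiter = new FixedWindowLimiter(100, 60_000)
console.log(limiter.allow(0)) // prints true: first request in the window
```

In the plugin itself the counting is tracked per client and rejected requests receive a 429 response; this sketch only shows why a max of 100 with a timeWindow of '1 minute' bounds the request rate.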

Suggested changeset (2 files)

src/profiler/memoryReporting.ts: run the following command in your local git repository to apply this patch:
cat << 'EOF' | git apply
diff --git a/src/profiler/memoryReporting.ts b/src/profiler/memoryReporting.ts
--- a/src/profiler/memoryReporting.ts
+++ b/src/profiler/memoryReporting.ts
@@ -4,2 +4,3 @@
 import * as fastify from 'fastify'
+import rateLimit from 'fastify-rate-limit'
 import { resourceUsage } from 'process'
@@ -80,2 +81,7 @@
 
+    this.server.register(rateLimit, {
+      max: 100, // max 100 requests per 1 minute
+      timeWindow: '1 minute'
+    })
+
     this.server.get(
EOF
package.json (outside changed files): run the following command to apply this patch:
cat << 'EOF' | git apply
diff --git a/package.json b/package.json
--- a/package.json
+++ b/package.json
@@ -91,3 +91,4 @@
     "streamroller": "^3.1.3",
-    "tydb": "^0.1.5"
+    "tydb": "^0.1.5",
+    "fastify-rate-limit": "^5.9.0"
   },
EOF
This fix introduces the dependency fastify-rate-limit (npm) 5.9.0, which has no known security advisories.
+    )
+
+    this.server.get(
+      '/df',
+      {
+        preHandler: async (_request, reply) => {
+          isDebugMiddleware(_request, reply)
+        },
+      },
+      (req, res) => {
+        const df = spawn('df')
+        df.stdout.on('data', (dataBuffer) => {
+          res.send(dataBuffer.toString())
+          df.kill()
+        })
+        df.on('close', (code) => {
+          console.log(`child process exited with code ${code}`)
+        })
+        df.stderr.on('data', (data) => {
+          console.log('df command error', data)
+          res.send('df command error')
+          df.kill()
+        })
+      }
Comment on lines +112 to +126

Check failure: Code scanning / CodeQL: Missing rate limiting (High)

This route handler performs a system command, but is not rate-limited.

Copilot Autofix (AI), 6 months ago

The best way to fix the problem is to add rate limiting to the route handler that performs the system command, using a rate-limiting middleware such as fastify-rate-limit.

Here are the steps to fix the problem:

  1. Install the fastify-rate-limit package.
  2. Import it in the src/profiler/memoryReporting.ts file.
  3. Register the fastify-rate-limit plugin with the Fastify instance.
  4. Configure the rate-limit options, for example limiting requests to 100 per 15 minutes.
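As a side note on step 4, a timeWindow string such as '15 minutes' ultimately denotes a millisecond count. The helper below is purely illustrative, to make the option concrete: timeWindowToMs is not part of any library, and the plugin performs its own conversion internally.

```typescript
// Illustrative parser for time-window strings like '1 minute' or
// '15 minutes'. Invented for this example; not a library function.
const UNIT_MS: Record<string, number> = {
  second: 1_000,
  minute: 60_000,
  hour: 3_600_000,
}

function timeWindowToMs(window: string): number {
  // Accepts "<count> <unit>" with an optional trailing 's' on the unit.
  const match = window.trim().match(/^(\d+)\s*(second|minute|hour)s?$/)
  if (!match) throw new Error(`unrecognized time window: ${window}`)
  return Number(match[1]) * UNIT_MS[match[2]]
}

console.log(timeWindowToMs('15 minutes')) // prints 900000
```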
Suggested changeset (2 files)

src/profiler/memoryReporting.ts: run the following command in your local git repository to apply this patch:
cat << 'EOF' | git apply
diff --git a/src/profiler/memoryReporting.ts b/src/profiler/memoryReporting.ts
--- a/src/profiler/memoryReporting.ts
+++ b/src/profiler/memoryReporting.ts
@@ -4,2 +4,3 @@
 import * as fastify from 'fastify'
+import rateLimit from 'fastify-rate-limit'
 import { resourceUsage } from 'process'
@@ -41,2 +42,6 @@
   registerEndpoints(): void {
+    this.server.register(rateLimit, {
+      max: 100, // max number of connections during windowMs milliseconds before sending a 429 response
+      timeWindow: '15 minutes' // duration of the window for max connections
+    })
     this.server.get(
EOF
package.json (outside changed files): run the following command to apply this patch:
cat << 'EOF' | git apply
diff --git a/package.json b/package.json
--- a/package.json
+++ b/package.json
@@ -91,3 +91,4 @@
     "streamroller": "^3.1.3",
-    "tydb": "^0.1.5"
+    "tydb": "^0.1.5",
+    "fastify-rate-limit": "^5.9.0"
   },
EOF
This fix introduces the dependency fastify-rate-limit (npm) 5.9.0, which has no known security advisories.
+    )
   }

   updateCpuPercent(): void {
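Each route above delegates its gating to isDebugMiddleware from ../DebugMode, whose implementation is not part of this diff. Below is a rough, hypothetical sketch of what such a gate could look like, assuming it checks a debug flag and rejects with a 403 otherwise; DebugReply, debugGate, and debugEnabled are all invented names, not the project's real code:

```typescript
// Hypothetical sketch of a debug gate in the spirit of isDebugMiddleware.
// The real implementation lives in ../DebugMode and is not shown in this PR;
// the types and the debugEnabled flag are invented for illustration.
interface DebugReply {
  statusCode: number
  body: string
}

let debugEnabled = false // stand-in for the node's real debug-mode setting

function debugGate(reply: DebugReply): boolean {
  if (!debugEnabled) {
    // Reject the request before the route handler runs.
    reply.statusCode = 403
    reply.body = 'endpoint only available in debug mode'
    return false
  }
  return true
}

const reply: DebugReply = { statusCode: 200, body: '' }
console.log(debugGate(reply)) // prints false: debug mode is off by default
```

In the actual PR the gate runs as a Fastify preHandler, so a rejected request never reaches the spawn('top') or spawn('df') handlers.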
85 changes: 59 additions & 26 deletions src/profiler/nestedCounters.ts
@@ -1,6 +1,7 @@
 import * as fastify from 'fastify'
 import { stringifyReduce } from './StringifyReduce'
 import * as core from '@shardus/crypto-utils'
+import { isDebugMiddleware } from '../DebugMode'

 type CounterMap = Map<string, CounterNode>
 interface CounterNode {
@@ -35,33 +36,65 @@ class NestedCounters {
   }

   registerEndpoints(): void {
-    this.server.get('/counts', (req, res) => {
-      let outputStr = ''
-      const arrayReport = this.arrayitizeAndSort(this.eventCounters)
-      outputStr += `${Date.now()}\n`
-      outputStr = this.printArrayReport(arrayReport, outputStr, 0)
-      res.send(outputStr)
-    })
-    this.server.get('/counts-reset', (req, res) => {
-      this.eventCounters = new Map()
-      res.send(`counts reset ${Date.now()}`)
-    })
+    this.server.get(
+      '/counts',
+      {
+        preHandler: async (_request, reply) => {
+          isDebugMiddleware(_request, reply)
+        },
+      },
+      (req, res) => {
+        let outputStr = ''
+        const arrayReport = this.arrayitizeAndSort(this.eventCounters)
+        outputStr += `${Date.now()}\n`
+        outputStr = this.printArrayReport(arrayReport, outputStr, 0)
+        res.send(outputStr)
+      }
+    )
+    this.server.get(
+      '/counts-reset',
+      {
+        preHandler: async (_request, reply) => {
+          isDebugMiddleware(_request, reply)
+        },
+      },
+      (req, res) => {
+        this.eventCounters = new Map()
+        res.send(`counts reset ${Date.now()}`)
+      }
+    )

-    this.server.get('/debug-inf-loop', (req, res) => {
-      res.send('starting inf loop, goodbye')
-      this.infLoopDebug = true
-      while (this.infLoopDebug) {
-        const s = 'asdf'
-        const s2 = stringifyReduce({ test: [s, s, s, s, s, s, s] })
-        const s3 = stringifyReduce({ test: [s2, s2, s2, s2, s2, s2, s2] })
-        core.hash(s3)
-      }
-    })
+    this.server.get(
+      '/debug-inf-loop',
+      {
+        preHandler: async (_request, reply) => {
+          isDebugMiddleware(_request, reply)
+        },
+      },
+      (req, res) => {
+        res.send('starting inf loop, goodbye')
+        this.infLoopDebug = true
+        while (this.infLoopDebug) {
+          const s = 'asdf'
+          const s2 = stringifyReduce({ test: [s, s, s, s, s, s, s] })
+          const s3 = stringifyReduce({ test: [s2, s2, s2, s2, s2, s2, s2] })
+          core.hash(s3)
+        }
+      }
+    )

-    this.server.get('/debug-inf-loop-off', (req, res) => {
-      this.infLoopDebug = false
-      res.send('stopping inf loop, who knows if this is possible')
-    })
+    this.server.get(
+      '/debug-inf-loop-off',
+      {
+        preHandler: async (_request, reply) => {
+          isDebugMiddleware(_request, reply)
+        },
+      },
+      (req, res) => {
+        this.infLoopDebug = false
+        res.send('stopping inf loop, who knows if this is possible')
+      }
+    )
   }

countEvent(category1: string, category2: string, count = 1): void {
17 changes: 13 additions & 4 deletions src/profiler/profiler.ts
@@ -1,3 +1,4 @@
+import { isDebugMiddleware } from '../DebugMode'
 import { nestedCountersInstance } from './nestedCounters'
 import * as fastify from 'fastify'

@@ -35,10 +36,18 @@ class Profiler {
   }

   registerEndpoints(): void {
-    this.server.get('/perf', (req, res) => {
-      const result = this.printAndClearReport()
-      res.send(result)
-    })
+    this.server.get(
+      '/perf',
+      {
+        preHandler: async (_request, reply) => {
+          isDebugMiddleware(_request, reply)
+        },
+      },
+      (req, res) => {
+        const result = this.printAndClearReport()
+        res.send(result)
+      }
+    )
   }

profileSectionStart(sectionName: string, internal = false): void {