This repository was archived by the owner on Nov 8, 2024. It is now read-only.

Commit bd160d0

Merge pull request #13 from apiaryio/greenkeeper/initial
Update dependencies to enable Greenkeeper 🌴
2 parents: cd661d8 + befe89d (commit bd160d0)


README.md

Lines changed: 7 additions & 6 deletions
@@ -4,14 +4,15 @@
 [![Build Status](https://travis-ci.org/apiaryio/curl-trace-parser.svg)](https://travis-ci.org/apiaryio/curl-trace-parser)
 [![Dependency Status](https://david-dm.org/apiaryio/curl-trace-parser.svg)](https://david-dm.org/apiaryio/curl-trace-parser)
 [![devDependency Status](https://david-dm.org/apiaryio/curl-trace-parser/dev-status.svg)](https://david-dm.org/apiaryio/curl-trace-parser?type=dev)
+[![Greenkeeper badge](https://badges.greenkeeper.io/apiaryio/curl-trace-parser.svg)](https://greenkeeper.io/)
 
 
 ## The story
 
-Did you know that you can record the raw HTTP communication of the [Curl command-line tool](http://curl.haxx.se/docs/manpage.html) with the `--trace` and `--trace-ascii` options? It's the only way I know to get raw HTTP communication without using [`tcpdump`](http://www.tcpdump.org/) or [`wireshark`](http://www.wireshark.org/).
+Did you know that you can record the raw HTTP communication of the [Curl command-line tool](http://curl.haxx.se/docs/manpage.html) with the `--trace` and `--trace-ascii` options? It's the only way I know to get raw HTTP communication without using [`tcpdump`](http://www.tcpdump.org/) or [`wireshark`](http://www.wireshark.org/).
 For example, this trick is very useful for proper introspection into the HTTP communication of an undocumented RESTful API.
 
-The only glitch is that cURL `--trace` saves data in [its custom format][gist], far from human-friendly, saving chunks as they are received and splitting them by packets. If you want a human-readable form, then this parser is what you need. Delivered as a Node.js package.
+The only glitch is that cURL `--trace` saves data in [its custom format][gist], far from human-friendly, saving chunks as they are received and splitting them by packets. If you want a human-readable form, then this parser is what you need. Delivered as a Node.js package.
 
 [gist]: https://gist.github.com/netmilk/6048533
 
@@ -34,7 +35,7 @@ $ npm install -g curl-trace-parser
 ```
 
 ## Record your first trace file
-
+
 ```bash
 $ curl --trace tracefile --header "Content-Type: application/json" \
 --request POST \
@@ -57,7 +58,7 @@ The output is ASCII representation of a raw [HTTP message][message] with few mod
 - Request and Response are delimited by CR+LF
 - Both Request and Response are terminated by an extra trailing LF
 
-Note: This is a little bit tricky because the HTTP RFC does not define a delimiter between Request and Response, for obvious reasons.
+Note: This is a little bit tricky because the HTTP RFC does not define a delimiter between Request and Response, for obvious reasons.
 
 ```bash
 $ cat tracefile | curl-trace-parser --raw
@@ -129,7 +130,7 @@ fs.readFile('./tracefile', 'utf8', function (err,trace) {
 })
 ```
 
-## Output format reverse parser
+## Output format reverse parser
 
 ```javascript
 var fs = require('fs');
@@ -145,7 +146,7 @@ fs.readFile('./tracefile', 'utf8', function (err,trace) {
 
 `parseToString(traceString)` - parse a string with a trace into the [output format]
 
-`parseBack(outputString)` - parse a string in the [output format] back into an object with the raw request and response
+`parseBack(outputString)` - parse a string in the [output format] back into an object with the raw request and response
 
 
 [output format]: https://github.com/apiaryio/curl-trace-parser#output-format
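
For reference, a minimal sketch of how the two calls documented above could be combined. It assumes the package is required as `curl-trace-parser` and that `parseToString` and `parseBack` are exported directly on the module; the property names of the object returned by `parseBack` are not shown in this diff, so the sketch only inspects its keys.

```javascript
// Minimal usage sketch; the module name and export shape are assumptions,
// based only on the function names documented in the README diff above.
var fs = require('fs');
var parser = require('curl-trace-parser');

fs.readFile('./tracefile', 'utf8', function (err, trace) {
  if (err) throw err;

  // Trace file -> human-readable output format
  var output = parser.parseToString(trace);
  console.log(output);

  // Output format -> object with the raw request and response
  var restored = parser.parseBack(output);
  console.log(Object.keys(restored));
});
```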
