= ReṼoman (Rev-Woman)
Gopal S Akshintala
:Revision: 1.0
ifdef::env-github[]
:tip-caption: :bulb:
:note-caption: :information_source:
:important-caption: :heavy_exclamation_mark:
:caution-caption: :fire:
:warning-caption: :warning:
endif::[]
:hide-uri-scheme:
:toc:
:toc-placement!:
:figure-caption!:
:sourcedir: src/main/kotlin
:testdir: src/test/java
:integrationtestdir: src/integrationTest/java
:pmtemplates: src/integrationTest/resources/pm-templates
:imagesdir: docs/images
:prewrap!:
:revoman-version: 0.5.7

'''

*ReṼoman* is an API automation tool for JVM (Java/Kotlin) from the API-first SaaS company, *Salesforce*. It re-imagines API automation by letting you execute a Postman collection in a JVM program/test.

'''

[.lead]
To start with, think of it as Postman for the JVM (Java/Kotlin):
it emulates the https://learning.postman.com/docs/collections/running-collections/intro-to-collection-runs/[**Postman Collection Runner**] through a Java program,
essentially translating your manual testing into automation, without any loss or friction.
But it's even better!

image::postman-run.png[]
image::manual-to-automation.png[Manual to Automation, align="center"]

[.lead]
It strikes a balance between the _flexibility_
provided by low-level tools like https://rest-assured.io/[**REST Assured**] or https://cucumber.io/[**Cucumber**] and the _ease of use_
provided by UI tools like https://www.postman.com/[**Postman**].

image::hybrid-tool.png[]

== Artifact

[.lead]
Maven
[source,xml,subs=attributes+]
----
<dependency>
  <groupId>com.salesforce.revoman</groupId>
  <artifactId>revoman</artifactId>
  <version>{revoman-version}</version>
</dependency>
----
[.lead]
Bazel
[source,bzl,subs=attributes+]
----
"com.salesforce.revoman:revoman"
----
[.lead]
Gradle Kts
[source,kts,subs=attributes+]
----
implementation("com.salesforce.revoman:revoman:{revoman-version}")
----

[.lead]
Minimum Java Version required = 17

toc::[]

.Tech-Talk given at https://www.opensourceindia.in/osi-speakers-2023/gopala-sarma-akshintala/[Open Source Conf—2023], https://speakerdeck.com/gopalakshintala/revoman-a-template-driven-api-automation-tool-for-jvm[🎴Slide deck]
image::revoman-demo-thumbnail.png[link="https://www.youtube.com/watch?v=YxeRddSFkxc&list=PLrJbJ9wDl9EC0bG6y9fyDylcfmB_lT_Or&index=2", align="center"]

== Why ReṼoman?

=== The Problem

* The majority of JVM SaaS applications are REST-based. But the API automation is done through a Mul-*T*-verse of Integration/Functional tests, E2E tests and Manual tests, each with its own frameworks, tools, and internal utilities, testing almost the same code flow.
* These custom, alien automation frameworks, often built using low-level tools like https://rest-assured.io/[**REST Assured**], are specific to a service or domain, rigid to reuse or extend, and difficult to maintain.
* This automation competes with the Prod code on cognitive complexity and learning curve, and mostly, automation wins.
* After a point, the API automation may deviate from its purpose of augmenting real end-user interaction and turn into a drag on development.

image::cognitive-complexity.png[align="center"]

=== The Solution

Contrary to these custom frameworks,
almost every team uses https://www.postman.com/product/what-is-postman[*Postman*] for manually testing their APIs.
Postman collections contain a lot of information about your APIs and the order
in which they need to be executed for manual testing,
in a https://www.postman.com/templates/[Structured Template].
Leveraging it can save a lot of code when translating those manual steps into automation.

____

* How _productive_ would it be if you could plug in your exported Postman collection template,
which you would have created anyway for your manual testing, and execute it through your JVM tests?

* How about a Universal API automation tool that promotes low code and low cognitive complexity, and strikes a balance between flexibility and ease of use?

____

== API automation with _ReṼoman_

=== Template-Driven Testing

* The exported Postman collection JSON file is referred to as a Postman template, as it contains some placeholders/variables in the `+{{variable-key}}+` pattern. You can read more about it https://learning.postman.com/docs/sending-requests/variables/[here]
* ReṼoman understands these templates and replaces these variables at runtime, similar to Postman. It supports:
** Nested variables, e.g., `+{{variable-key{{nested-variable-key}}}}+`
** link:{sourcedir}/com/salesforce/revoman/internal/postman/DynamicVariableGenerator.kt[Dynamic variables], e.g., `{{$randomUUID}}`, `{{$randomEmail}}`
** <<_custom_dynamic_variables,Custom Dynamic variables>>

[NOTE]
====
In the case of collision between variable keys, the precedence order to derive a value to replace any key is:

. Custom Dynamic variables
. Generated Dynamic variables
. Dynamic Environment supplied through config + Environment built during execution + Postman environment supplied as a file through config
====

TIP: For Dynamic variables that get populated from Java code, use a `$` sign within the variable name (e.g., `{{$unitPrice}}`) to indicate that it needs to be populated by hand during manual testing.

[.lead]
You can _kick off_ this *Template-Driven Testing* by invoking `ReVoman.revUp()`,
supplying your Postman templates and environments, and all your customizations through a *Configuration*:

[source,java,indent=0,tabsize=2,options="nowrap"]
----
final var rundown =
    ReVoman.revUp(
        Kick.configure()
            ...
            .off());
----

=== A Simple Example

Here is a simple link:{pmtemplates}/restfulapidev/restful-api.dev.postman_collection.json[Exported Postman collection] and link:{pmtemplates}/restfulapidev/restful-api.dev.postman_environment.json[Environment],
to hit a free public https://restful-api.dev/[RESTFUL-API].
You can import and manually test this collection through the `Run collection` button like this:

image::resfulapi-dev-pm.png[]

You can automate the same
using ReṼoman in a link:{integrationtestdir}/com/salesforce/revoman/integration/restfulapidev/RestfulAPIDevTest.java[Junit test]
by supplying the template and environment path:

ifdef::env-github[]

[source,java,indent=0,tabsize=2,options="nowrap"]
.link:{integrationtestdir}/com/salesforce/revoman/integration/restfulapidev/RestfulAPIDevTest.java[RestfulAPIDevTest.java, tag=revoman-simple-demo]
----
@Test
@DisplayName("restful-api.dev")
void restfulApiDev() {
  final var rundown =
      ReVoman.revUp( // <1>
          Kick.configure()
              .templatePath(PM_COLLECTION_PATH) // <2>
              .environmentPath(PM_ENVIRONMENT_PATH) // <3>
              .off());
  assertThat(rundown.firstUnIgnoredUnsuccessfulStepReport()).isNull(); // <4>
  assertThat(rundown.stepReports).hasSize(3); // <5>
}
----
<1> `revUp` is the method to call, passing a configuration built as below
<2> Supply an exported Postman collection JSON file path
<3> Supply an exported Postman environment JSON file path
<4> Assert that the execution doesn't have any failures
<5> Run more assertions on the <<_rundown,Rundown>>

endif::[]
ifndef::env-github[]

[source,java,indent=0,tabsize=2,options="nowrap"]
.link:{integrationtestdir}/com/salesforce/revoman/integration/restfulapidev/RestfulAPIDevTest.java[RestfulAPIDevTest.java,tag=revoman-simple-demo]
----
include::{integrationtestdir}/com/salesforce/revoman/integration/restfulapidev/RestfulAPIDevTest.java[tag=revoman-simple-demo]
----
<1> `revUp` is the method to call, passing a configuration built as below
<2> Supply an exported Postman collection JSON file path
<3> Supply an exported Postman environment JSON file path
<4> Assert that the execution doesn't have any failures
<5> Run more assertions on the <<_rundown,Rundown>>

endif::[]

=== Rundown

After all this, you receive a detailed *Rundown* in return.
It contains everything you need to know about what happened in an execution,
so that you can seamlessly run more assertions on top of the run.

[source,kotlin,indent=0,tabsize=2,options="nowrap"]
----
Rundown(
  val stepReports: List<StepReport>,
  val mutableEnv: PostmanEnvironment)

StepReport(
  step: Step,
  requestInfo: Either<RequestFailure, TxnInfo<Request>>? = null, // <1>
  preStepHookFailure: PreStepHookFailure? = null,
  responseInfo: Either<ResponseFailure, TxnInfo<Response>>? = null,
  postStepHookFailure: PostStepHookFailure? = null,
  envSnapshot: PostmanEnvironment // <2>
)
----
<1> The https://docs.vavr.io/#_either[`Either` type from the VAVR] library represents one of two states, used here to represent failure or success
<2> Snapshot of the Environment at the end of each step execution. It can be compared with the previous or next step's environment snapshot to see what changed in this step

[.lead]
`Rundown` has many convenience methods that ease applying further assertions on top of it.

TIP: Other simple examples to see in Action: link:{integrationtestdir}/com/salesforce/revoman/integration/pokemon/PokemonTest.java[PokemonTest.java]

=== Advanced Example

[.lead]
ReṼoman isn't just limited to executing a collection like Postman does;
you can add more _bells and whistles_ 🔔:

ifdef::env-github[]

[source,java,indent=0,tabsize=2,options="nowrap"]
.link:{integrationtestdir}/com/salesforce/revoman/integration/core/pq/PQE2EWithSMTest.java[PQE2EWithSMTest.java, tag=pq-e2e-with-revoman-config-demo]
----
final var pqRundown =
    ReVoman.revUp( // <1>
        Kick.configure()
            .templatePaths(PQ_TEMPLATE_PATHS) // <2>
            .environmentPath(PQ_ENV_PATH) // <3>
            .dynamicEnvironment( // <4>
                Map.of(
                    "$quoteFieldsToQuery", "LineItemCount, CalculationStatus",
                    "$qliFieldsToQuery", "Id, Product2Id",
                    "$qlrFieldsToQuery", "Id, QuoteId, MainQuoteLineId, AssociatedQuoteLineId"))
            .customDynamicVariableGenerator( // <5>
                "$unitPrice",
                (ignore1, ignore2, ignore3) -> String.valueOf(Random.Default.nextInt(999) + 1))
            .nodeModulesRelativePath("js") // <6>
            .haltOnFailureOfTypeExcept(
                HTTP_STATUS,
                afterAllStepsContainingHeader("ignoreHTTPStatusUnsuccessful")) // <7>
            .requestConfig( // <8>
                unmarshallRequest(
                    beforeStepContainingURIPathOfAny(PQ_URI_PATH),
                    PlaceQuoteInputRepresentation.class,
                    adapter(PlaceQuoteInputRepresentation.class)))
            .responseConfig( // <9>
                unmarshallResponse(
                    afterStepContainingURIPathOfAny(PQ_URI_PATH),
                    PlaceQuoteOutputRepresentation.class),
                unmarshallResponse(
                    afterStepContainingURIPathOfAny(COMPOSITE_GRAPH_URI_PATH),
                    CompositeGraphResponse.class,
                    CompositeGraphResponse.ADAPTER))
            .hooks( // <10>
                pre(
                    beforeStepContainingURIPathOfAny(PQ_URI_PATH),
                    (step, requestInfo, rundown) -> {
                      if (requestInfo.containsHeader(IS_SYNC_HEADER)) {
                        LOGGER.info("This is a Sync step: {}", step);
                      }
                    }),
                post(
                    afterStepContainingURIPathOfAny(PQ_URI_PATH),
                    (stepReport, ignore) -> {
                      validatePQResponse(stepReport); // <11>
                      final var isSyncStep =
                          stepReport.responseInfo.get().containsHeader(IS_SYNC_HEADER);
                      if (!isSyncStep) {
                        LOGGER.info(
                            "Waiting in PostHook of the Async Step: {}, for the Quote's Asynchronous processing to finish",
                            stepReport.step);
                        // ! CAUTION 10/09/23 gopala.akshintala: This can be flaky until
                        // polling is implemented
                        Thread.sleep(5000);
                      }
                    }),
                post(
                    afterStepContainingURIPathOfAny(COMPOSITE_GRAPH_URI_PATH),
                    (stepReport, ignore) -> validateCompositeGraphResponse(stepReport)),
                post(
                    afterStepName("query-quote-and-related-records"),
                    (ignore, rundown) -> assertAfterPQCreate(rundown.mutableEnv)))
            .globalCustomTypeAdapter(new IDAdapter()) // <12>
            .insecureHttp(true) // <13>
            .off()); // Kick-off
assertThat(pqRundown.firstUnIgnoredUnsuccessfulStepReport()).isNull(); // <14>
assertThat(pqRundown.mutableEnv)
    .containsAtLeastEntriesIn(
        Map.of(
            "quoteCalculationStatusForSkipPricing", PricingPref.Skip.completeStatus,
            "quoteCalculationStatus", PricingPref.System.completeStatus,
            "quoteCalculationStatusAfterAllUpdates", PricingPref.System.completeStatus));
----
<1> `revUp()` is the method to call, passing a configuration built as below
<2> Supply the path (relative to resources) to the Template Collection JSON file/files
<3> Supply the path (relative to resources) to the Environment JSON file/files
<4> Supply any dynamic environment that is runtime-specific
<5> <<_custom_dynamic_variables,Custom Dynamic variables>>
<6> Node modules path (relative to resources) to be used inside <<_pre_req_and_post_res_scripts,Pre-req and Post-res scripts>>
<7> <<_execution_control,Execution Control>>
<8> <<#_type_safety_with_flexible_json_pojo_marshallingserialization_and_unmarshallingdeserialization, Request Config>>
<9> <<#_type_safety_with_flexible_json_pojo_marshallingserialization_and_unmarshallingdeserialization, Response Config>>
<10> <<#_pre_step_and_post_step_hooks>>
<11> <<#_response_validations,Response Validations>>
<12> <<#_type_safety_with_flexible_json_pojo_marshallingserialization_and_unmarshallingdeserialization, Global Custom Type Adapters>>
<13> Ignore Java cert issues when firing HTTP calls
<14> Run more assertions on the <<_rundown,Rundown>>

endif::[]
ifndef::env-github[]

[source,java,indent=0,tabsize=2,options="nowrap"]
.link:{integrationtestdir}/com/salesforce/revoman/integration/core/pq/PQE2EWithSMTest.java[PQE2EWithSMTest.java, tag=pq-e2e-with-revoman-config-demo]
----
include::{integrationtestdir}/com/salesforce/revoman/integration/core/pq/PQE2EWithSMTest.java[tag=pq-e2e-with-revoman-config-demo]
----
<1> `revUp()` is the method to call, passing a configuration built as below
<2> Supply the path (relative to resources) to the Template Collection JSON file/files
<3> Supply the path (relative to resources) to the Environment JSON file/files
<4> Supply any dynamic environment that is runtime-specific
<5> <<_custom_dynamic_variables,Custom Dynamic variables>>
<6> Node modules path (relative to resources) to be used inside <<_pre_req_and_post_res_scripts,Pre-req and Post-res scripts>>
<7> <<_execution_control,Execution Control>>
<8> <<#_type_safety_with_flexible_json_pojo_marshallingserialization_and_unmarshallingdeserialization, Request Config>>
<9> <<#_type_safety_with_flexible_json_pojo_marshallingserialization_and_unmarshallingdeserialization, Response Config>>
<10> <<#_pre_step_and_post_step_hooks>>
<11> <<#_response_validations,Response Validations>>
<12> <<#_type_safety_with_flexible_json_pojo_marshallingserialization_and_unmarshallingdeserialization, Global Custom Type Adapters>>
<13> Ignore Java cert issues when firing HTTP calls
<14> Run more assertions on the <<_rundown,Rundown>>
endif::[]

== Reuse Config

You can define a base common config and reuse it by overriding certain properties using the `override...()` methods,
which are present for each config attribute.
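
A hypothetical sketch of what that could look like — `overrideEnvironmentPath(...)` is an assumed name following the `override...()` convention; check the config builder for the actual method per attribute:

[source,java,indent=0,tabsize=2,options="nowrap"]
----
// A shared base config, reused with one attribute overridden per execution.
// `overrideEnvironmentPath(...)` is an assumed method name, shown purely for illustration.
final var baseConfig =
    Kick.configure()
        .templatePath(PM_COLLECTION_PATH)
        .environmentPath(PM_ENVIRONMENT_PATH)
        .off();
final var stagingRundown =
    ReVoman.revUp(
        baseConfig.overrideEnvironmentPath("pm-templates/staging.postman_environment.json")); // hypothetical path
----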

== Debugging UX

[.lead]
This tool has a particular emphasis on the Debugging experience. Here is what a debugger view of a <<_rundown,Rundown>> looks like:

image:rundown.png[Rundown of all steps]

[.lead]
🔍 Let's zoom into a detailed view of one of those Step reports, which contains complete Request and Response info along with failure information if any:

image:step-report.png[Step Report]

[.lead]
Here are the environment *key-value* pairs accumulated over the entire execution, appended to the environment supplied through the file and `dynamicEnvironment`:

image:mutable-env.png[Mutable environment after the execution completion]

[.lead]
If something goes wrong at any stage during the Step execution, ReṼoman *fails-fast* and captures the `Failure` in the StepReport:

image:step-execution.png[Step Execution]

[.lead]
Here is the failure hierarchy of what can go wrong in this process:

image::failure-hierarchy.png[Failure Hierarchy]

[.lead]
ReṼoman logs all the key operations that happen inside its source code,
including how the environment variables are mutated by each step in its <<#_mutable_environment,Mutable Environment>>.
Watch your console to check what's going on in the execution, or troubleshoot from CI/CD logs.

NOTE: link:docs/revoman.exe.log[📝Sample log] printed during execution

image:pq-exe-logging.gif[Monitor Execution]

== Features

[#_type_safety_with_flexible_json_pojo_marshallingserialization_and_unmarshallingdeserialization]
=== Type Safety with flexible JSON <--> POJO marshalling/serialization and unmarshalling/deserialization

Postman operates purely with JSON.
When interoperating Postman with JVM,
unmarshalling/deserialization of JSON into a POJO and vice versa helps in writing Type-safe JVM code and enhances the debugging experience on JVM.
ReṼoman internally uses a modern JSON library called https://github.com/square/moshi[**Moshi**].
Simple types whose JSON structure aligns with the POJO data structure can be converted directly.
But when the JSON structure doesn't align with the POJO,
you may need a _Custom Type Adapter_ for Marshalling to JSON or Unmarshalling from JSON.
Moshi has you covered with its advanced adapter mechanism, and ReṼoman accepts Moshi adapters.
Check out these methods that help interop between Postman and the JVM:

==== `globalSkipTypes()`

There may be a POJO that inherits or contains legacy types that are hard or impossible to serialize.
ReṼoman lets you serialize only the types that matter, through `globalSkipTypes`,
where you can filter out these legacy types from Marshalling/Unmarshalling.

==== `requestConfig()`

* Configure Moshi adapters to unmarshall/deserialize _Request_ JSON payload to a POJO on certain steps.
* You may use the bundled static factory method `RequestConfig.unmarshallRequest()` for expressiveness.
* You can pass a `PreTxnStepPick` which is a `Predicate` used
to qualify a step whose Request JSON payload needs to be unmarshalled/deserialized.
* If you have set up `requestConfig()` once, wherever you wish to read the request in your <<#_pre_step_and_post_step_hooks,Pre-Step Hooks>>, you can call `stepReport.requestInfo.get().getTypedTxnObj()` which returns your request JSON as a Strong type.
* If you don't configure it for a step, Moshi unmarshalls the step request JSON into default data structures like `LinkedHashMap`

==== `responseConfig()`

* Configure Moshi adapters to unmarshall/deserialize _Response_ JSON payload to a POJO on certain steps.
* You can configure separate adapters for success and error responses. Success or Error is determined by default from the HTTP status code (SUCCESSFUL: `200 <= statusCode <= 299`).
* Use the bundled static factory methods like `ResponseConfig.unmarshallSuccessResponse()` and `ResponseConfig.unmarshallErrorResponse()` for expressiveness (see the sketch after this list).
* You can pass a `PostTxnStepPick` which is a `Predicate` used
to qualify a step whose Response JSON payload needs to be unmarshalled/deserialized.
* If you have set up `responseConfig()` once, wherever you wish to read or assert the response in your <<#_pre_step_and_post_step_hooks,Post-Step Hooks>>, you can call `stepReport.responseInfo.get().getTypedTxnObj()` which returns your response JSON as a Strong type to conveniently assert.
* If you don't configure it for a step, Moshi unmarshalls the Step response JSON into default data structures like `LinkedHashMap`
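
A minimal sketch of separate success/error adapters, assuming the success/error variants take the same (pick, POJO type) arguments as `unmarshallResponse(...)` in the advanced example above; `ErrorResponse` is a hypothetical POJO:

[source,java,indent=0,tabsize=2,options="nowrap"]
----
.responseConfig(
    unmarshallSuccessResponse(
        afterStepContainingURIPathOfAny(PQ_URI_PATH),
        PlaceQuoteOutputRepresentation.class),
    unmarshallErrorResponse(
        afterStepContainingURIPathOfAny(PQ_URI_PATH),
        ErrorResponse.class)) // hypothetical error POJO, for illustration
----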

[TIP]
====
* There may be a scenario
where you cannot depend on the HTTP status code to distinguish between success and error.
* In such a case,
you can leverage a bundled Moshi factory link:{sourcedir}/com/salesforce/revoman/input/json/factories/DiMorphicAdapter.kt[DiMorphicAdapter]
that deals with it dynamically based on a `boolean` attribute value in the response JSON.
See `DiMorphicAdapter` in action from the test `compositeGraphResponseDiMorphicMarshallUnmarshall()` in link:{testdir}/com/salesforce/revoman/input/json/JsonPojoUtilsTest.java[JsonPojoUtilsTest].
* You may also refer to link:https://square.github.io/moshi/1.x/moshi-adapters/adapters/com.squareup.moshi.adapters/-polymorphic-json-adapter-factory/index.html[PolymorphicJsonAdapterFactory] for Marshalling/Unmarshalling based on a `String`-typed attribute.
====

==== `globalCustomTypeAdapters()`

* These come in handy when the same POJO/data structure (e.g., `ID`) is present across multiple Request or Response POJOs. These adapters complement the custom adapters set up in `requestConfig()` or `responseConfig()` wherever these types are present.
* But these adapters won't be used to marshall/unmarshall before or after each step execution, which means you won't see strong types in the debug view. You can, however, get them on-demand using the respective `getTypedObj()` method.

==== JSON Reader/Writer Utils to build Moshi adapters

ReṼoman also comes
bundled with link:{sourcedir}/com/salesforce/revoman/input/json/JsonReaderUtils.kt[JSON Reader utils] and link:{sourcedir}/com/salesforce/revoman/input/json/JsonWriterUtils.kt[JSON Writer utils]
to help build Moshi adapters.

TIP: Refer to link:{integrationtestdir}/com/salesforce/revoman/integration/core/adapters/ConnectInputRepWithGraphAdapter.java[ConnectInputRepWithGraphAdapter] to see how these utils come in handy in building an advanced Moshi adapter

==== JSON POJO Utils

The bundled link:{sourcedir}/com/salesforce/revoman/input/json/JsonPojoUtils.kt[JSON POJO Utils] can be used directly to convert JSON to a POJO and vice versa.
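
A minimal sketch, assuming a (POJO class, JSON string) overload of `jsonToPojo` — see the linked tests below for the exact signatures and the POJO-to-JSON counterpart:

[source,java,indent=0,tabsize=2,options="nowrap"]
----
final PlaceQuoteOutputRepresentation pqResponse =
    JsonPojoUtils.jsonToPojo(
        PlaceQuoteOutputRepresentation.class,
        stepReport.responseInfo.get().httpMsg.bodyString()); // JSON string from a step's response
----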

[TIP]
====
See in Action:

* link:{testdir}/com/salesforce/revoman/input/json/JsonPojoUtilsTest.java[JsonPojoUtilsTest]
* link:{integrationtestdir}/com/salesforce/revoman/input/json/JsonPojoUtils2Test.java[JsonPojoUtils2Test]
====

=== Execution Control

The configuration offers methods through which the execution strategy can be controlled without making any changes to the template:

* `haltOnAnyFailure` — Defaults to `false`. If set to `true`, the execution fails-fast when it encounters a failure of any type.
* `haltOnFailureOfTypeExcept` — Accepts pairs of `ExeType` and a `PostTxnStepPick`, used to decide whether a failure of that specific type can be ignored for a given step.
* `runOnlySteps`, `skipSteps` — These accept a predicate of type `ExeStepPick`, which is invoked with the current `Step` instance to decide whether to execute or skip it (see the sketch below).
** Some `ExeStepPick` predicates come bundled with ReṼoman under `ExeStepPick.PickUtils`, e.g., `withName`, `inFolder`. You can also write a custom predicate of your own.
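
A minimal sketch of these knobs together. It assumes `haltOnAnyFailure` takes a `boolean`, `skipSteps`/`runOnlySteps` accept a bundled pick, and `inFolder(...)`/`withName(...)` take the folder/step name as a `String`; the folder and step names are made up for illustration:

[source,java,indent=0,tabsize=2,options="nowrap"]
----
// A minimal sketch, assuming the bundled picks from `ExeStepPick.PickUtils` are statically imported.
final var rundown =
    ReVoman.revUp(
        Kick.configure()
            .templatePath(PM_COLLECTION_PATH)
            .haltOnAnyFailure(true) // fail-fast on the first failure of any type
            .skipSteps(inFolder("cleanup")) // hypothetical folder name
            // .runOnlySteps(withName("create-quote")) // or run only hand-picked steps
            .off());
----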

[#_pre_step_and_post_step_hooks]
=== Pre-Step and Post-Step Hooks

A hook lets you fiddle with the execution by plugging in your custom JVM code before or after a Step execution.

[#_step_picks]
You can pass a `PreTxnStepPick`/`PostTxnStepPick`, which is a `Predicate` used
to qualify a step for a Pre-Step/Post-Step Hook respectively.
ReṼoman comes
bundled with some predicates under the namespaces `PreTxnStepPick.PickUtils`/`PostTxnStepPick.PickUtils`, e.g., `beforeStepContainingURIPathOfAny()`,
`afterStepName()` etc. If those don't fit your needs, you can write your own custom predicates like below:

[source,java,indent=0,tabsize=2,options="nowrap"]
----
final PreTxnStepPick preTxnStepPick =
    (currentStep, requestInfo, rundown) -> { LOGGER.info("Picked `preLogHook` before stepName: {}", currentStep); return true; };
final PostTxnStepPick postTxnStepPick =
    (stepReport, rundown) -> { LOGGER.info("Picked `postLogHook` after stepName: {}", stepReport.step.displayName); return true; };
----

Add them to the config as below:

[source,java,indent=0,tabsize=2,options="nowrap"]
----
.hooks(
    pre(
        preTxnStepPick,
        (step, requestInfo, rundown) -> {
          // ...code...
        }),
    post(
        postTxnStepPick,
        (stepReport, rundown) -> {
          // ...code...
        }))
----

* You can do things like asserting on the rundown, <<#_response_validations,Response Validations>>,
or even <<#_mutable_environment,mutate the environment>> with a value you derived programmatically,
such that the execution of later steps picks up those changes.
* Reserve hooks for plugging in your custom code, or for asserting and failing fast in the middle of an execution.
If your assertions can wait till the final rundown,
it's cleaner to write them after `revUp()` returns the rundown instead of adding hooks for each step.

[#_plug_in_your_java_code_in_between_postman_execution]
==== Plug-in your Java code in-between Postman execution

You can plug in your Java code
to create/generate values for environment variables,
which can then be picked up by subsequent steps.
For example, you may want some `xyzId`, but you don't have a Postman collection to create it.
Instead, you have a Java utility to generate/create it.
You can invoke the utility in a pre-hook of a step and set the value in `rundown.mutableEnv`,
so the later steps can pick up the value for the `+{{xyzId}}+` variable from the environment.
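
A minimal sketch of that flow — the `beforeStepName(...)` pick, the `XyzIdGenerator` utility, and the `set(...)` mutator on `mutableEnv` are assumptions shown purely for illustration:

[source,java,indent=0,tabsize=2,options="nowrap"]
----
pre(
    beforeStepName("create-order"), // hypothetical pick and step name
    (step, requestInfo, rundown) ->
        // Later steps resolve `{{xyzId}}` from the environment
        rundown.mutableEnv.set("xyzId", XyzIdGenerator.generate())) // hypothetical utility and mutator
----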

[#_response_validations]
==== Response Validations

* Post-Step hooks are the best place to validate the response right after a step.
* If you have configured a strong type for your response through `responseConfig()` (or `globalCustomTypeAdapters()`), you can write type-safe validations by extracting your strongly-typed object using `stepReport.responseInfo.get().getTypedTxnObj()`, or use `JsonPojoUtils.jsonToPojo(TypeT, stepReport.responseInfo.get().httpMsg.bodyString())` to convert it inline (a sketch follows this list).
* If your response data structure is non-trivial and you need to run validations with different strategies like `fail-fast` or `error-accumulation`, consider using a library like https://github.com/salesforce-misc/Vador[*Vador*]
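
For instance, a minimal sketch of a type-safe validation inside a Post-Step hook, assuming `responseConfig()` is set up to unmarshall this step's response into `PlaceQuoteOutputRepresentation` (as in the advanced example above); the asserted getter is hypothetical:

[source,java,indent=0,tabsize=2,options="nowrap"]
----
post(
    afterStepContainingURIPathOfAny(PQ_URI_PATH),
    (stepReport, rundown) -> {
      // Typed extraction works because `responseConfig()` is configured for this step
      final PlaceQuoteOutputRepresentation pqResponse =
          stepReport.responseInfo.get().getTypedTxnObj();
      assertThat(pqResponse.getSuccess()).isTrue(); // hypothetical getter, for illustration
    })
----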

=== Pre-req and Post-res scripts

* Postman lets you write custom JavaScript in https://learning.postman.com/docs/writing-scripts/script-references/test-examples/[Pre-req and Post-res tabs] that get executed before and after a step respectively. When you export the collection as a template, these scripts also come bundled.
* ReṼoman can execute this JavaScript on the JVM. This support ensures that the Postman collection used for manual testing can be used *as-is* for automation too, without any need to modify it or the overhead of maintaining separate versions for manual testing and automation.
** The Pre-req JS script is executed first in a step, before the request is unmarshalled.
** The Post-res JS script is executed right after the HTTP response is received.
* ReṼoman supports using `npm` modules inside your Pre-req and Post-res JS scripts. You can install `npm` modules in any folder using traditional commands like `npm install <module-name>`, and supply the relative path of the parent folder that contains the `node_modules` folder in the `Kick` config using `nodeModulesRelativePath(...)`. Use those `npm` modules inside your scripts with `require(...)`, for example:

.Install `moment` with npm
[source,shellscript,indent=0,options="nowrap"]
----
npm install moment
----

.Use inside pre-req and post-res scripts
[source,javascript,indent=0,tabsize=2,options="nowrap"]
----
var moment = require("moment")
var _ = require('lodash')

pm.environment.set("$currentDate", moment().format(("YYYY-MM-DD")))
var futureDateTime = moment().add(365, 'days')
pm.environment.set('$randomFutureDate', futureDateTime.format('YYYY-MM-DD'))

pm.environment.set("$quantity", _.random(1, 10))
----

[TIP]
====
* `node_modules` adds a lot of files to check in. You may replace them with a single distribution file

image::node_modules.png[]

* If `node_modules` is ignored in your git repo, you can force-add it for check-in using a command like `git add --all -f <path>/node_modules`
====

CAUTION: The recommendation is not to add too much code in <<_pre_req_and_post_res_scripts,Pre-req and Post-res scripts>>, as it's not intuitive to troubleshoot through debugging. Use them for simple operations that can be understood without debugging, and use <<#_pre_step_and_post_step_hooks,Pre-Step/Post-Step Hooks>> for any non-trivial operations, which are intuitive to debug.

[#_mutable_environment]
=== Mutable Environment

* The Environment is the only mutable shared state across step executions; it can be used to pass data between the consumer and the library.
* It can be mutated (set key-value pairs) during execution through <<_pre_req_and_post_res_scripts,Pre-req and Post-res scripts>> (using `pm.environment.set()`) and <<#_pre_step_and_post_step_hooks,Pre-Step/Post-Step Hooks>> (using the reference `rundown.mutableEnv`).

==== Read Mutable Environment as Postman Environment JSON format

You may want to troubleshoot manually with Postman using the Mutable environment built during the ReṼoman execution.
`rundown.mutableEnv.postmanEnvJSONFormat()` converts the mutable environment into a Postman JSON format,
so you can copy and import that environment conveniently into Postman.
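
A minimal sketch (printing to the console here just for illustration):

[source,java,indent=0,tabsize=2,options="nowrap"]
----
final var rundown = ReVoman.revUp(Kick.configure().templatePath(PM_COLLECTION_PATH).off());
// Copy this output (or the same value from the debugger) and import/paste it into Postman
System.out.println(rundown.mutableEnv.postmanEnvJSONFormat());
----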

TIP: You do NOT need to save the copied Postman environment JSON from the debugger into a file.
You can paste it (kbd:[Ctrl+V]) directly into the Postman environment window

==== Read Environment value into Strong type

You can read any value from `mutableEnv` as a strong type using `rundown.mutableEnv.getTypedObj()`.
See it in action in the `getTypedObj()`
test in link:{testdir}/com/salesforce/revoman/output/postman/PostmanEnvironmentTest.java[PostmanEnvironmentTest]
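
A minimal sketch, assuming `getTypedObj` takes the variable key and the target type — refer to the linked test for the exact signature; `Quote` is a hypothetical POJO:

[source,java,indent=0,tabsize=2,options="nowrap"]
----
final Quote quote = rundown.mutableEnv.getTypedObj("quote", Quote.class); // assumed signature, hypothetical POJO
----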

==== `pmEnvSnapshot` in each StepReport

Each StepReport also carries a `pmEnvSnapshot`, which you can use to assert that a step executed as expected, and to compare snapshots from different steps to examine the execution progress.

=== Compose Modular Executions

* You don't have to squash all your steps into one mega collection. Instead, you can break them into easy-to-manage modular collections. `ReVoman.revUp()` accepts a list of collection paths through `templatePaths()` (see the sketch below).
* But that doesn't mean you have to execute all these templates in one go. You can make multiple `ReVoman.revUp()` calls for different collections.
* If you wish to compose these executions in a specific order, you can use the `revUp()` overload which accepts a vararg of `Kick` configs.
** You can also achieve the same yourself by adding the previous execution's `mutableEnv` to the current execution through the `dynamicEnvironment` parameter. This also comes in handy when you wish to execute a common step (e.g., `UserSetup`) inside a test setup method and use that environment for all the tests.
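
A minimal sketch, assuming `templatePaths(...)` accepts a `List` of paths (as with `PQ_TEMPLATE_PATHS` in the advanced example); the collection paths are made up for illustration:

[source,java,indent=0,tabsize=2,options="nowrap"]
----
final var rundown =
    ReVoman.revUp(
        Kick.configure()
            .templatePaths(
                List.of(
                    "pm-templates/user-setup.postman_collection.json", // hypothetical paths
                    "pm-templates/place-quote.postman_collection.json"))
            .environmentPath(PM_ENVIRONMENT_PATH)
            .off());
----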

=== Custom Dynamic variables

If the in-built dynamic variables don't fit your needs, you can plug in your own dynamic variable generator,
which is invoked at runtime to generate a value for your custom variable-key in the template.
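
A minimal sketch, reusing the `$unitPrice` generator from the advanced example above — every `+{{$unitPrice}}+` occurrence in the template gets replaced with the generated value at runtime:

[source,java,indent=0,tabsize=2,options="nowrap"]
----
final var rundown =
    ReVoman.revUp(
        Kick.configure()
            .templatePath(PM_COLLECTION_PATH)
            .customDynamicVariableGenerator(
                "$unitPrice",
                (ignore1, ignore2, ignore3) -> String.valueOf(Random.Default.nextInt(999) + 1))
            .off());
----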

== USP

=== Low-code

TIP: Here is an example of a low-code link:{integrationtestdir}/com/salesforce/revoman/integration/core/pq/PQE2EWithSMTest.java[**E2E test**] that automates *~75 steps*

[.lead]
Compared to a traditional Integration/Functional or E2E test, the amount of code needed with ReṼoman is approximately *89%* less.
The above test isn't just low on code; it's also low in Cognitive Complexity and transparent about what it does.

=== Low Learning Curve

Familiarity with Postman gets you a long way in understanding this tool.
Playing with the existing link:{integrationtestdir}[Integration Tests]
and writing a couple of hands-on tests should get you started.

=== CI/CD integrability

* ReṼoman is like any JVM library that you can plug into any JVM program/test (e.g., JUnit tests or Integration tests).
* Apart from adding a dependency in the build tool, there is *no extra setup needed* to execute these tests with ReṼoman in CI/CD.

=== Up-to-date Postman collections that live along with Code in VCS

* A nice side effect: this keeps the Postman collections always up to date, and the entire Postman collection guards each *check-in* in the form of a Test suite augmenting manual testing.
* Any day, you can find an up-to-date Postman collection for every feature you need to test, right in your VCS (Git) along with your code. Developers can import these templates directly from VCS into Postman for manual testing. This comes in very handy during a team blitz.

=== Unified framework for Automating __Persona-based__ Manual testing

* ReṼoman brings a *Unified & Simplified Test strategy* across the mul-**T**-verse (Integration Tests, E2E Tests, and Manual testing with Postman) for any API.
* The automation stays as close as possible to Persona-based Manual testing, leading to Transparency and better Traceability of issues
* This forces engineers to think like *API-first customers* while writing tests.
* *Test Data setup:* You can use ReṼoman for test data setup too. This eliminates the need for different teams to write their own internal utilities for data setup.
* An *E2E Test* can even reside outside the Service repo, as long as it can hit the service API

== Perf

This entire execution of **~75 steps**, including **10 async steps**, took a mere *122 sec* on localhost.
It can be even faster in auto-build environments.

image:pq-revoman-test-time.png[Localhost Test time on FTest console for ~75 steps]

WARNING: ReṼoman itself is very lightweight; execution times are mostly a function of how fast your server responds and your network speed.

== Future

[.lead]
The future looks bright with multiple impactful features in the pipeline:

* API metrics and Analytics
* *It's built with extensibility* in mind. It can easily be extended to support other template formats, such as Kaiju templates used for availability testing.
* In-built polling support for Async steps
* Payload generation
* Flow control through YAML config

== FAQs

=== How to Debug a step in the middle of an Execution?

* You can add a <<#_pre_step_and_post_step_hooks,pre-hook>> to the Step you are interested in and add a breakpoint inside it. This gets hit before ReṼoman fires the request for that Step.
* You can get more adventurous by attaching the ReṼoman jar sources and adding conditional breakpoints directly inside the library source code. Search for the logs that indicate key operations and add conditional breakpoints there, with conditions like the Step name.

=== Is there a way to add Metadata to a Postman collection Step?

* You can add key-value pairs to a Step's HTTP Headers section (e.g., `ignoreHTTPStatusUnsuccessful=true`).
* You can use this information in <<#_step_picks,Step Picks>> or <<#_pre_step_and_post_step_hooks>> to identify a particular step and execute conditional logic (see the sketch below).
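
For example, a minimal sketch mirroring the advanced example above — steps carrying the `ignoreHTTPStatusUnsuccessful` header are exempted from halting the run on HTTP-status failures:

[source,java,indent=0,tabsize=2,options="nowrap"]
----
.haltOnFailureOfTypeExcept(
    HTTP_STATUS,
    afterAllStepsContainingHeader("ignoreHTTPStatusUnsuccessful"))
----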

=== Do I need to migrate all my existing TestUtils to Postman Collections?

You don't have to.
This is a JVM-first tool,
and you can interlace your TestUtils through <<#_plug_in_your_java_code_in_between_postman_execution,Pre-Step/Post-Step Hooks>>

=== Why not use https://learning.postman.com/docs/collections/using-newman-cli/command-line-integration-with-newman[Newman] or https://learning.postman.com/docs/postman-cli/postman-cli-overview/#comparing-the-postman-cli-and-newman[Postman CLI]?

* ReṼoman may look similar to Newman or the Postman CLI when it comes to executing a Postman collection, but the _similarities end there_.
* Newman and the Postman CLI are built for Node and cannot be executed within a JVM. Even if you manage to run them in some hacky way, there is no easy way to assert on the results.
* ReṼoman is JVM-first, lets you configure a lot more, and gives you back a detailed report to assert on in a type-safe way.

== 🙌🏼 Consume-Collaborate-Contribute

* This link:CONTRIBUTING.adoc[CONTRIBUTING] doc has all the information to set up this library locally and get hands-on.
* Any issues or PRs are welcome! ♥️
* Join this https://sfdc.co/revoman-slack[Slack Community] to discuss issues or PRs related to Consumption-Collaboration-Contribution