
Commit d252271

HADOOP-18975 fips: yetus review
Change-Id: I2395d5246ab2d46989bf32973b31722df45a7cad
1 parent: c54b74c

2 files changed (+8, -8 lines)


hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/connecting.md

Lines changed: 3 additions & 4 deletions
```diff
@@ -129,15 +129,14 @@ See [Timeouts](performance.html#timeouts).
   <value>Default_JSSE</value>
   <description>
     TLS implementation and cipher options.
-
     Values: OpenSSL, Default, Default_JSSE, Default_JSSE_with_GCM
 
     Default_JSSE is not truly the the default JSSE implementation because
     the GCM cipher is disabled when running on Java 8. However, the name
     was not changed in order to preserve backwards compatibility. Instead,
     new mode called Default_JSSE_with_GCM delegates to the default JSSE
     implementation with no changes to the list of enabled ciphers.
-
+
     OpenSSL requires the wildfly JAR on the classpath and a compatible installation of the openssl binaries.
     It is often faster than the JVM libraries, but also trickier to
     use.
```
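For context, the `<value>` and `<description>` lines above belong to the S3A TLS channel-mode option. A minimal sketch of selecting one of the listed implementations through the Hadoop `Configuration` API — the property name `fs.s3a.ssl.channel.mode` is assumed from the surrounding documentation, and the snippet is illustrative rather than part of this commit:

```java
import org.apache.hadoop.conf.Configuration;

public class SslChannelModeSketch {
  public static void main(String[] args) {
    Configuration conf = new Configuration();
    // Pick one of the documented values:
    // OpenSSL, Default, Default_JSSE, Default_JSSE_with_GCM.
    // Default_JSSE_with_GCM delegates to the stock JSSE implementation
    // without disabling the GCM cipher.
    conf.set("fs.s3a.ssl.channel.mode", "Default_JSSE_with_GCM");
    System.out.println(conf.get("fs.s3a.ssl.channel.mode"));
  }
}
```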
```diff
@@ -428,10 +427,10 @@ Before using Access Points make sure you're not impacted by the following:
 considering endpoints, if you have any custom signers that use the host endpoint property make
 sure to update them if needed;
 
-
 ## <a name="debugging"></a> Debugging network problems
 
-The `storediag` command within the utility [cloudstore](https://github.com/exampleoughran/cloudstore) JAR is recommended as the way to view and print settings.
+The `storediag` command within the utility [cloudstore](https://github.com/exampleoughran/cloudstore)
+JAR is recommended as the way to view and print settings.
 
 If `storediag` doesn't connect to your S3 store, *nothing else will*.
```
hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/ITestS3AEndpointRegion.java

Lines changed: 5 additions & 4 deletions
```diff
@@ -178,9 +178,8 @@ public void testWithFips() throws Throwable {
   public void testWithFipsAndEndpoint() throws Throwable {
     describe("Create a client with fips and an endpoint");
 
-    intercept(IllegalArgumentException.class, ERROR_ENDPOINT_WITH_FIPS,
-        () ->createS3Client(getConfiguration(),
-            CENTRAL_ENDPOINT, null, US_EAST_1, true));
+    intercept(IllegalArgumentException.class, ERROR_ENDPOINT_WITH_FIPS, () ->
+        createS3Client(getConfiguration(), CENTRAL_ENDPOINT, null, US_EAST_1, true));
   }
 
   @Test
```
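The test above checks that asking for FIPS endpoints while also pinning an endpoint is rejected. A hedged sketch of the configuration combination the test exercises — the option names `fs.s3a.endpoint` and `fs.s3a.endpoint.fips` are assumed here (the FIPS key comes from the HADOOP-18975 work), and the snippet is illustrative only:

```java
import org.apache.hadoop.conf.Configuration;

public class FipsEndpointConflictSketch {
  public static void main(String[] args) {
    Configuration conf = new Configuration();
    // A fixed/central endpoint...
    conf.set("fs.s3a.endpoint", "s3.amazonaws.com");
    // ...combined with a request for FIPS endpoints.
    conf.setBoolean("fs.s3a.endpoint.fips", true);
    // Creating the S3A client with both options set is the case
    // testWithFipsAndEndpoint() expects to fail with an
    // IllegalArgumentException (ERROR_ENDPOINT_WITH_FIPS).
  }
}
```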
```diff
@@ -291,7 +290,9 @@ public void beforeExecution(Context.BeforeExecution context,
           executionAttributes.getAttribute(AwsExecutionAttribute.AWS_REGION).toString())
           .describedAs("Incorrect region set").isEqualTo(region);
 
-      Assertions.assertThat(executionAttributes.getAttribute(AwsExecutionAttribute.FIPS_ENDPOINT_ENABLED))
+      // verify the fips state matches expectation.
+      Assertions.assertThat(executionAttributes.getAttribute(
+          AwsExecutionAttribute.FIPS_ENDPOINT_ENABLED))
           .describedAs("Incorrect FIPS flag set").isEqualTo(isFips);
 
       // We don't actually want to make a request, so exit early.
```
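The reformatted assertion reads the FIPS flag from the request's execution attributes inside an AWS SDK v2 execution interceptor. A self-contained sketch of that pattern, independent of the Hadoop test harness — it assumes the standard SDK v2 builder methods `fipsEnabled()` and `addExecutionInterceptor()`, and makes no request:

```java
import software.amazon.awssdk.awscore.AwsExecutionAttribute;
import software.amazon.awssdk.core.client.config.ClientOverrideConfiguration;
import software.amazon.awssdk.core.interceptor.Context;
import software.amazon.awssdk.core.interceptor.ExecutionAttributes;
import software.amazon.awssdk.core.interceptor.ExecutionInterceptor;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;

public class FipsInterceptorSketch {
  public static void main(String[] args) {
    // Interceptor that inspects the FIPS execution attribute before each
    // request, mirroring the assertion in the test above.
    ExecutionInterceptor fipsCheck = new ExecutionInterceptor() {
      @Override
      public void beforeExecution(Context.BeforeExecution context,
          ExecutionAttributes executionAttributes) {
        Boolean fips = executionAttributes.getAttribute(
            AwsExecutionAttribute.FIPS_ENDPOINT_ENABLED);
        System.out.println("FIPS endpoint enabled: " + fips);
      }
    };

    // Build a client with FIPS endpoints enabled and the interceptor attached.
    try (S3Client s3 = S3Client.builder()
        .region(Region.US_EAST_1)
        .fipsEnabled(true)
        .overrideConfiguration(ClientOverrideConfiguration.builder()
            .addExecutionInterceptor(fipsCheck)
            .build())
        .build()) {
      // No request is issued here; the interceptor fires on the first call.
    }
  }
}
```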
