docs/data-handling-and-conversion.md
The Terraform AWS Provider codebase bridges the implementation of a Terraform Plugin and an AWS API client to support AWS operations and data types as Terraform Resources. Data handling and conversion is a large portion of resource implementation given the domain-specific implementations on each side of the provider: Terraform is a generic infrastructure as code tool with a generic data model, while the other side is driven by AWS API data modeling concepts. This guide explains and demonstrates the preferred Terraform AWS Provider code implementations for translating data between these two systems.
At the bottom of this documentation is a Glossary section, which may be a helpful reference while reading the other sections.
Before getting into highly specific documentation about the Terraform AWS Provider handling of data, it may be helpful to briefly highlight how Terraform Plugins (Terraform Providers in this case) interact with Terraform CLI and the Terraform State in general and where this documentation fits into the whole process.
There are two primary data flows that are typically handled by resources within a Terraform Provider. Data is either converted from a planned new Terraform State into making a remote system request, referred to as "Expanding", or a remote system response is converted into an applied new Terraform State, referred to as "Flattening". The semantics of how the data of the planned new Terraform State is surfaced to the resource implementation is determined by where a resource is in its lifecycle and is mainly handled by Terraform CLI. This concept can be explored further in the Terraform Resource Instance Change Lifecycle documentation, with the caveat that some additional behaviors occur within the Terraform Plugin SDK as well (if the Terraform Plugin uses that implementation detail).
As a generic walkthrough, the following data handling occurs when creating a Terraform Resource:
During `terraform apply`, Terraform CLI surfaces the planned new state to the provider, the provider converts that data into remote system requests, and the responses are converted back into the applied new Terraform State. Those conversion steps are the focus of this page.
An important behavior to note with Terraform State handling is if the value of a particular root attribute or block is not refreshed during plan or apply operations, then the prior Terraform State is implicitly deep copied to the new Terraform State for that attribute or block.
Given a resource with a writeable root attribute named not_set_attr that never explicitly writes a value, the following happens:
not_set_attr = "anything" on resource creation, the Terraform State contains not_set_attr equal to "anything" after apply.not_set_attr = "updated", the Terraform State contains not_set_attr equal to "updated" after apply.This however does not apply to nested attributes and blocks if the parent block is refreshed.
Given a resource with a root block named parent, with nested child attributes set_attr and not_set_attr, a read operation which updates the value of parent (and the nested set_attr attribute) will not copy the Terraform State for the nested not_set_attr attribute.
There are valid use cases for passthrough attribute values such as these (see the Virtual Attributes section), however the behavior can be confusing or incorrect for operators if the drift detection is expected. Typically these types of drift detection issues can be discovered by implementing resource import testing with state verification.
Perhaps the most distinct difference between Terraform Plugin Framework and Terraform Plugin SDKv2 is data handling.
With the Terraform Plugin Framework, state data is strongly typed, while Plugin SDK V2 based resources represent state data generically (each attribute value is an `any`) and types must be asserted at runtime.
Strongly typed data eliminates an entire class of runtime bugs and crashes, but it does require compile-time type declarations and a slightly different approach to reading and writing data.
The sections below contain examples for both plugin libraries, but Terraform Plugin Framework is required for all net-new resources.
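The difference can be sketched in plain Go (a stdlib-only illustration with made-up values, not provider code):

```go
package main

import "fmt"

func main() {
	// Plugin SDK V2 style: attribute values arrive as `any` and must be
	// asserted to a concrete type at runtime.
	var v any = "vpc-123"

	s, ok := v.(string)
	fmt.Println(s, ok) // vpc-123 true

	// A wrong assertion is only caught when this line actually executes:
	_, ok = v.(int)
	fmt.Println(ok) // false
}
```

With the Plugin Framework, the equivalent mismatch is a compile error instead of a latent runtime failure.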
To expand on the data handling that occurs specifically within the Terraform AWS Provider resource implementations, the above resource creation items become the below in practice:
=== "Terraform Plugin Framework (Preferred)"
- The Create method of a resource is invoked with resource.CreateRequest containing the planned new state data (req.Plan) and an AWS API client (available via the Meta() method of the resource struct).
- Before reaching this point, the plan data was already translated from the Terraform Plugin Protocol data types by the Terraform Plugin Framework, so values can be read by invoking req.Plan.Get(ctx, &plan), where plan is an instance of the struct representing the resource's data.
- An AWS Go SDK operation input type (e.g., ec2.CreateVpcInput) is initialized
- For each necessary field to configure in the operation input type, the data is read from the plan struct and converted into the AWS Go SDK type for the field (e.g., *string)
- The AWS Go SDK operation is invoked and the output type (e.g., *ec2.CreateVpcOutput) is initialized
- For each necessary Attribute, Block, or resource identifier to be saved in the state, the data is read from the AWS Go SDK type for the field (*string), if necessary converted into the equivalent Plugin Framework compatible type, and saved into a mutated data struct
- Function is returned
=== "Terraform Plugin SDK V2"
- The CreateWithoutTimeout function of a schema.Resource is invoked with *schema.ResourceData containing the planned new state data (conventionally named d) and an AWS API client (conventionally named meta).
- Before reaching this point, the ResourceData was already translated from the Terraform Plugin Protocol data types by the Terraform Plugin SDK so values can be read by invoking d.Get() and d.GetOk() receiver methods with Attribute and Block names from the Schema of the schema.Resource.
- An AWS Go SDK operation input type (e.g., ec2.CreateVpcInput) is initialized
- For each necessary field to configure in the operation input type, the data is read from the ResourceData (e.g., d.Get(), d.GetOk()) and converted into the AWS Go SDK type for the field (e.g., *string)
- The AWS Go SDK operation is invoked and the output type (e.g., *ec2.CreateVpcOutput) is initialized
- For each necessary Attribute, Block, or resource identifier to be saved in the state, the data is read from the AWS Go SDK type for the field (*string), if necessary converted into a ResourceData compatible type, and saved into a mutated ResourceData (e.g., d.Set(), d.SetId())
- Function is returned
To further understand the necessary data conversions used throughout the Terraform AWS Provider codebase between AWS Go SDK types and Terraform Plugin types, the following tables can be referenced for most scenarios:
=== "Terraform Plugin Framework (Preferred)" <!-- markdownlint-disable no-inline-html --->
| AWS API Model | AWS Go SDK V2 | Terraform Plugin Framework | Terraform Language/State |
|---------------|---------------|----------------------------|--------------------------|
| `boolean` | `bool` | `types.Bool` | `bool` |
| `float` | `*float64`<br/>`*float32` | `types.Float64`<br/>`types.Float32` | `number` |
| `integer` | `*int64`<br/>`*int32` | `types.Int64`<br/>`types.Int32` | `number` |
| `list` | `[]T` | `types.List`<br/>`types.Set` | `list(any)`<br/>`set(any)` |
| `map` | `map[string]T` | `types.Map` | `map(any)` |
| `string` | `*string` | `types.String` | `string` |
| `structure` | `struct` | `types.List` with `MaxItems: 1` | `list(object(any))` |
| `timestamp` | `*time.Time` | `types.String` (typically RFC3339 formatted) | `string` |
<!-- markdownlint-enable no-inline-html -->
[Types](https://developer.hashicorp.com/terraform/plugin/framework/handling-data/types) are built into the Terraform Plugin Framework library and handle null and unknown values in accordance with the [Terraform type system](https://developer.hashicorp.com/terraform/plugin/framework/handling-data/terraform-concepts#type-system).
This eliminates the need for any special handling of zero values and provides better change detection on unset values.
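A rough, stdlib-only sketch of the idea (the real `types.Bool` lives in the Plugin Framework and is considerably more involved; `boolValue` here is a toy stand-in):

```go
package main

import "fmt"

// boolValue sketches how Plugin Framework values track null/unknown state
// alongside the data, unlike a bare Go zero value.
type boolValue struct {
	null    bool
	unknown bool
	value   bool
}

func (v boolValue) IsNull() bool    { return v.null }
func (v boolValue) IsUnknown() bool { return v.unknown }

func main() {
	unset := boolValue{null: true}
	explicitFalse := boolValue{value: false}

	fmt.Println(unset.IsNull())         // true
	fmt.Println(explicitFalse.IsNull()) // false: distinguishable from unset
}
```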
=== "Terraform Plugin SDK V2" <!-- markdownlint-disable no-inline-html --->
| AWS API Model | AWS Go SDK | Terraform Plugin SDK | Terraform Language/State |
|---------------|------------|----------------------|--------------------------|
| `boolean` | `*bool` | `TypeBool` (`bool`) | `bool` |
| `float` | `*float64`<br/>`*float32` | `TypeFloat` (`float64`) | `number` |
| `integer` | `*int64`<br/>`*int32` | `TypeInt` (`int`) | `number` |
| `list` | `[]T` | `TypeList` (`[]any` of `T`)<br/>`TypeSet` (`*schema.Set` of `T`) | `list(any)`<br/>`set(any)` |
| `map` | `map[string]T` | `TypeMap` (`map[string]any`) | `map(any)` |
| `string` | `*string` | `TypeString` (`string`) | `string` |
| `structure` | `struct` | `TypeList` (`[]any` of `map[string]any`) with `MaxItems: 1` | `list(object(any))` |
| `timestamp` | `*time.Time` | `TypeString` (typically RFC3339 formatted) | `string` |
<!-- markdownlint-enable no-inline-html -->
You may notice there are type encoding differences between the AWS Go SDK and Terraform Plugin SDK:
- AWS Go SDK types are mostly Go pointer types, while Terraform Plugin SDK types are not.
- AWS Go SDK structures are the Go `struct` type, while there is no semantically equivalent Terraform Plugin SDK type. Instead they are represented as a slice of interfaces with an underlying map of interfaces.
- AWS Go SDK types are all Go concrete types, while the Terraform Plugin SDK types for collections and maps are interfaces.
- AWS Go SDK whole-number types are always 32- or 64-bit, while the Terraform Plugin SDK type is implementation-specific.
Conceptually, the first and second items above are the most problematic in the Terraform AWS Provider codebase. The first item because non-pointer types in Go cannot implement the concept of no value (`nil`). The [Zero Value Mapping section](#zero-value-mapping) will go into more detail about the implications of this limitation. The second item because it can be confusing to always handle a structure ("object") type as a list.
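A minimal illustration of why pointer types matter here (the `String` helper mirrors the AWS Go SDK's `aws.String`; this is a sketch, not SDK code):

```go
package main

import "fmt"

// String mirrors the AWS Go SDK's aws.String helper: a nil pointer means
// "no value", which a plain Go string cannot express.
func String(v string) *string { return &v }

func main() {
	var unset *string   // never set: nil
	empty := String("") // explicitly set to the empty string

	fmt.Println(unset == nil) // true
	fmt.Println(empty == nil) // false
	fmt.Println(*empty == "") // true
}
```

The nil pointer and the pointer to an empty string are distinct, which is exactly the distinction non-pointer Go types cannot make.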
If an AWS API sets a default value on the server side, no default should be set on the provider side.
Instead, the argument should be marked as Optional and Computed.
This avoids potential future conflicts if a server side default value changes.
As a general rule, provider side default values should be avoided unless strictly necessary for a resource to function properly.
!!! note
    This section only applies to Plugin SDK V2 based resources. Terraform Plugin Framework based resources handle null and unknown values distinctly from zero values.
As mentioned in the Type Mapping section for Terraform Plugin SDK V2, there is a discrepancy between how the Terraform Plugin SDK represents values and the reality that a Terraform State may not configure an Attribute. These values will default to the matching underlying Go type "zero value" if not set:
| Terraform Plugin SDK | Go Type | Zero Value |
|---|---|---|
| `TypeBool` | `bool` | `false` |
| `TypeFloat` | `float64` | `0.0` |
| `TypeInt` | `int` | `0` |
| `TypeString` | `string` | `""` |
For Terraform resource logic this means that these special values must always be accounted for in the implementation. The semantics of the API and its meaning of the zero value determine how each case must be handled.
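The ambiguity can be demonstrated with a stdlib-only sketch of the `d.GetOk` semantics (`getOk` here is a toy stand-in backed by a map, not the real SDK implementation):

```go
package main

import "fmt"

// getOk mimics the Plugin SDK's d.GetOk semantics: the second return value
// is false when the value is unset OR equal to the type's zero value.
func getOk(state map[string]any, key string) (any, bool) {
	v, exists := state[key]
	if !exists {
		return nil, false
	}
	switch t := v.(type) {
	case bool:
		return t, t != false
	case int:
		return t, t != 0
	case string:
		return t, t != ""
	}
	return v, true
}

func main() {
	state := map[string]any{"enabled": false, "count": 3}

	_, ok := getOk(state, "enabled")
	fmt.Println(ok) // false: an explicit false is indistinguishable from unset

	v, ok := getOk(state, "count")
	fmt.Println(v, ok) // 3 true
}
```

This is why an explicitly configured `false` or `0` cannot be told apart from an unconfigured attribute when relying on `GetOk` alone.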
The maintainers can provide guidance on appropriate solutions for cases not mentioned in the Recommended Implementation section.
=== "Terraform Plugin Framework (Preferred)"
All Attributes and Blocks at the top level of a resource struct's Schema method are considered "root" attributes.
These are always handled with the Plan and State fields from the request and response pointers passed as arguments to the CRUD methods on the resource struct.
Values are read from and written to the underlying data structure during CRUD operations, and finally written to state in the response object with a call like resp.Diagnostics.Append(resp.State.Set(ctx, &plan)...).
=== "Terraform Plugin SDK V2"
All Attributes and Blocks at the top level of a schema.Resource's Schema are considered "root" attributes.
These will always be handled with receiver methods on ResourceData, such as reading with d.Get(), d.GetOk(), etc. and writing with d.Set().
Any nested Attributes and Blocks inside those root Blocks will then be handled with standard Go types according to the table in the Type Mapping section.
!!! warning
    While it is possible in certain type scenarios to deeply read and write ResourceData information for a Block Attribute, this practice is discouraged in preference of only handling root Attributes and Blocks.
Given the complexities around conversions between AWS and Terraform Plugin type systems, this section contains recommended implementations for Terraform AWS Provider resources.
!!! tip
    Some of these coding patterns may not be well represented in the codebase, as refactoring the many older styles over years of community development is a large task. However, this is meant to represent the preferred implementations today, which will continue to evolve as the codebase and the Terraform Plugin ecosystem change.
When using the Terraform Plugin Framework, there are two approaches for flattening and expanding Terraform data. The preferred is AutoFlex, which automatically converts between provider and AWS API structures by analyzing type information. Alternatively, provider developers can define flattening and expanding functions manually.
When using the Terraform Plugin SDK v2, flattening and expanding functions must be defined manually.
AutoFlex provides two entry-point functions, `Flatten` and `Expand`, defined in the package `github.com/hashicorp/terraform-provider-aws/internal/framework/flex`.
Without configuration, these two functions should be able to convert most provider and AWS API structures.
AutoFlex uses field names to map between the source and target structures:
A field-name prefix can also be applied with the AutoFlex option `flex.WithFieldNamePrefix`; see, e.g., Lex v2 Intents in `internal/service/lexv2models/intent.go`.

By default, AutoFlex ignores fields with the name `Tags`, as AWS resource tags are handled separately.
Additional fields can be ignored and the Tags field can be included by passing optional flex.AutoFlexOptionsFuncs to Flatten or Expand.
For example, to add an additional ignored field, use

```go
diags := flex.Expand(ctx, source, &target, flex.WithIgnoredFieldNamesAppend("OtherField"))
```

This will ignore both `Tags` and `OtherField`.
To empty the list of ignored fields, use `flex.WithNoIgnoredFieldNames`.
For example, to include `Tags`, call

```go
diags := flex.Expand(ctx, source, &target, flex.WithNoIgnoredFieldNames())
```
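The name-matching idea at the core of AutoFlex can be sketched with reflection (a stdlib-only toy; the real implementation in `internal/framework/flex` also handles Plugin Framework types, nested structures, diagnostics, and options):

```go
package main

import (
	"fmt"
	"reflect"
)

// expand copies between two structs by matching exported field names, the
// core idea behind AutoFlex's Expand. Types must match for this toy version.
func expand(source, target any) {
	src := reflect.ValueOf(source)
	dst := reflect.ValueOf(target).Elem()
	for i := 0; i < src.NumField(); i++ {
		name := src.Type().Field(i).Name
		if f := dst.FieldByName(name); f.IsValid() && f.CanSet() {
			f.Set(src.Field(i))
		}
	}
}

// planModel and apiInput are hypothetical stand-ins for a Terraform plan
// struct and an AWS API input struct.
type planModel struct {
	Name  string
	Count int
}

type apiInput struct {
	Name  string
	Count int
	Other string // no matching source field: left as the zero value
}

func main() {
	var input apiInput
	expand(planModel{Name: "example", Count: 2}, &input)
	fmt.Println(input.Name, input.Count) // example 2
}
```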
AutoFlex is able to convert single-element lists from Terraform blocks into single struct or pointer values in AWS API structs.
The flexing of individual struct fields can be customized by using Go struct tags, with the namespace autoflex.
Tag values are comma-separated lists of options, with a leading comma.
The option legacy can be used when migrating a resource or data source from the Terraform Plugin SDK to the Terraform Plugin Framework.
This will preserve certain behaviors from the Plugin SDK, such as treating zero-values, i.e. the empty string or a numeric zero, equivalently to null values.
This is equivalent to calling the `fwflex.<Type><To/From>FrameworkLegacy` functions.
For example, from the struct resourceManagedUserPoolClientModel for the Cognito IDP Managed User Pool Client:
```go
type resourceManagedUserPoolClientModel struct {
    AccessTokenValidity types.Int64  `tfsdk:"access_token_validity" autoflex:",legacy"`
    AllowedOauthFlows   types.Set    `tfsdk:"allowed_oauth_flows"`
    ...
    ClientSecret        types.String `tfsdk:"client_secret"`
    DefaultRedirectUri  types.String `tfsdk:"default_redirect_uri" autoflex:",legacy"`
    ...
    ID                  types.String `tfsdk:"id"`
    IdTokenValidity     types.Int64  `tfsdk:"id_token_validity" autoflex:",legacy"`
    LogoutUrls          types.Set    `tfsdk:"logout_urls"`
    ...
}
```
The option omitempty can be used with string values to store a null value when an empty string is returned.
For example, from the struct refreshOnDayModel for the QuickSight Refresh Schedule:
```go
type refreshOnDayModel struct {
    DayOfMonth types.String `tfsdk:"day_of_month"`
    DayOfWeek  types.String `tfsdk:"day_of_week" autoflex:",omitempty"`
}
```
To completely ignore a field, use the tag value -.
For example, from the struct scheduleModel for the QuickSight Refresh Schedule:
```go
type scheduleModel struct {
    RefreshType        types.String                                           `tfsdk:"refresh_type"`
    ScheduleFrequency  fwtypes.ListNestedObjectValueOf[refreshFrequencyModel] `tfsdk:"schedule_frequency"`
    StartAfterDateTime types.String                                           `tfsdk:"start_after_date_time" autoflex:"-"`
}
```
To ignore a field when expanding, but include it when flattening, use the option noexpand.
To ignore a field when flattening, but include it when expanding, use the option noflatten.
For example, from the struct dataSourceReservedCacheNodeOfferingModel for the ElastiCache Reserved Cache Node Offering:
```go
type dataSourceReservedCacheNodeOfferingModel struct {
    CacheNodeType      types.String            `tfsdk:"cache_node_type"`
    Duration           fwtypes.RFC3339Duration `tfsdk:"duration" autoflex:",noflatten"`
    FixedPrice         types.Float64           `tfsdk:"fixed_price"`
    OfferingID         types.String            `tfsdk:"offering_id"`
    OfferingType       types.String            `tfsdk:"offering_type"`
    ProductDescription types.String            `tfsdk:"product_description"`
}
```
In some cases, flattening and expanding need conditional handling. One important case is new AWS API implementations where the input or output structs make use of union types. The AWS implementation uses an interface as the common type, along with various concrete implementations. Because the Terraform schema does not support union types (see this issue for discussion), the provider defines nested schemas for each type with a restriction to allow only one.
To override flattening behavior, implement the interface flex.Flattener on the model.
The function should have a pointer receiver, as it will modify the struct in-place.
From the Mainframe Modernization (M2) environment (internal/service/m2/environment.go):
```go
type storageConfigurationModel struct {
    EFS fwtypes.ListNestedObjectValueOf[efsStorageConfigurationModel] `tfsdk:"efs"`
    FSX fwtypes.ListNestedObjectValueOf[fsxStorageConfigurationModel] `tfsdk:"fsx"`
}

func (m *storageConfigurationModel) Flatten(ctx context.Context, v any) (diags diag.Diagnostics) {
    switch t := v.(type) {
    case awstypes.StorageConfigurationMemberEfs:
        var model efsStorageConfigurationModel
        d := fwflex.Flatten(ctx, t.Value, &model)
        diags.Append(d...)
        if diags.HasError() {
            return diags
        }

        m.EFS = fwtypes.NewListNestedObjectValueOfPtrMust(ctx, &model)

        return diags

    case awstypes.StorageConfigurationMemberFsx:
        var model fsxStorageConfigurationModel
        d := fwflex.Flatten(ctx, t.Value, &model)
        diags.Append(d...)
        if diags.HasError() {
            return diags
        }

        m.FSX = fwtypes.NewListNestedObjectValueOfPtrMust(ctx, &model)

        return diags

    default:
        return diags
    }
}
```
To override expanding behavior, implement the interface flex.Expander on the model.
As the function should not modify the struct in-place, it should not have a pointer receiver.
From the Mainframe Modernization (M2) environment (internal/service/m2/environment.go):
```go
type storageConfigurationModel struct {
    EFS fwtypes.ListNestedObjectValueOf[efsStorageConfigurationModel] `tfsdk:"efs"`
    FSX fwtypes.ListNestedObjectValueOf[fsxStorageConfigurationModel] `tfsdk:"fsx"`
}

func (m storageConfigurationModel) Expand(ctx context.Context) (result any, diags diag.Diagnostics) {
    switch {
    case !m.EFS.IsNull():
        efsStorageConfigurationData, d := m.EFS.ToPtr(ctx)
        diags.Append(d...)
        if diags.HasError() {
            return nil, diags
        }

        var r awstypes.StorageConfigurationMemberEfs
        diags.Append(fwflex.Expand(ctx, efsStorageConfigurationData, &r.Value)...)
        if diags.HasError() {
            return nil, diags
        }

        return &r, diags

    case !m.FSX.IsNull():
        fsxStorageConfigurationData, d := m.FSX.ToPtr(ctx)
        diags.Append(d...)
        if diags.HasError() {
            return nil, diags
        }

        var r awstypes.StorageConfigurationMemberFsx
        diags.Append(fwflex.Expand(ctx, fsxStorageConfigurationData, &r.Value)...)
        if diags.HasError() {
            return nil, diags
        }

        return &r, diags
    }

    return nil, diags
}
```
In some cases, the result types for expanding will be different when creating or updating a resource.
For example, for the Verified Permissions identity source, the create operation takes a Configuration struct while the update operation takes an UpdateConfiguration, even though the contents are identical.
In this case, implement the interface flex.TypedExpander on the model.
From the Verified Permissions identity source (internal/service/verifiedpermissions/identity_source.go):
```go
type configuration struct {
    CognitoUserPoolConfiguration fwtypes.ListNestedObjectValueOf[cognitoUserPoolConfiguration] `tfsdk:"cognito_user_pool_configuration"`
    OpenIDConnectConfiguration   fwtypes.ListNestedObjectValueOf[openIDConnectConfiguration]   `tfsdk:"open_id_connect_configuration"`
}

func (m configuration) ExpandTo(ctx context.Context, targetType reflect.Type) (result any, diags diag.Diagnostics) {
    switch targetType {
    case reflect.TypeFor[awstypes.Configuration]():
        return m.expandToConfiguration(ctx)

    case reflect.TypeFor[awstypes.UpdateConfiguration]():
        return m.expandToUpdateConfiguration(ctx)
    }

    return nil, diags
}

func (m configuration) expandToConfiguration(ctx context.Context) (result awstypes.Configuration, diags diag.Diagnostics) {
    switch {
    case !m.CognitoUserPoolConfiguration.IsNull():
        var result awstypes.ConfigurationMemberCognitoUserPoolConfiguration
        diags.Append(flex.Expand(ctx, m.CognitoUserPoolConfiguration, &result.Value)...)
        if diags.HasError() {
            return nil, diags
        }

        return &result, diags

    case !m.OpenIDConnectConfiguration.IsNull():
        var result awstypes.ConfigurationMemberOpenIdConnectConfiguration
        diags.Append(flex.Expand(ctx, m.OpenIDConnectConfiguration, &result.Value)...)
        if diags.HasError() {
            return nil, diags
        }

        return &result, diags
    }

    return nil, diags
}

func (m configuration) expandToUpdateConfiguration(ctx context.Context) (result awstypes.UpdateConfiguration, diags diag.Diagnostics) {
    switch {
    case !m.CognitoUserPoolConfiguration.IsNull():
        var result awstypes.UpdateConfigurationMemberCognitoUserPoolConfiguration
        diags.Append(flex.Expand(ctx, m.CognitoUserPoolConfiguration, &result.Value)...)
        if diags.HasError() {
            return nil, diags
        }

        return &result, diags

    case !m.OpenIDConnectConfiguration.IsNull():
        var result awstypes.UpdateConfigurationMemberOpenIdConnectConfiguration
        diags.Append(flex.Expand(ctx, m.OpenIDConnectConfiguration, &result.Value)...)
        if diags.HasError() {
            return nil, diags
        }

        return &result, diags
    }

    return nil, diags
}
```
AutoFlex can output detailed logging as it flattens or expands a value.
To turn on logging for AutoFlex, set the logging level with the environment variable `TF_LOG_AWS_AUTOFLEX`.
Valid values are `ERROR`, `WARN`, `INFO`, `DEBUG`, and `TRACE`.
By default, AutoFlex logging is set to `ERROR`.
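For example, to enable the most verbose AutoFlex logging in the current shell before running the provider or its tests (the variable name and levels come from the text above):

```shell
# Enable TRACE-level AutoFlex logging for subsequent commands in this shell.
export TF_LOG_AWS_AUTOFLEX=TRACE
```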
By convention in the codebase, each level of Block handling beyond root attributes should be separated into "expand" functions that convert Terraform Plugin SDK data into the equivalent AWS Go SDK type (typically named `expand{Service}{Type}`) and "flatten" functions that convert an AWS Go SDK type into the equivalent Terraform Plugin SDK data (typically named `flatten{Service}{Type}`).
Define FLatten and EXpand (i.e., flex) functions at the most local level possible. This table provides guidance on the preferred place to define flex functions based on usage.
| Where Used | Where to Define | Include Service in Name |
|---|---|---|
| One resource (e.g., `aws_instance`) | Resource file (e.g., `internal/service/ec2/instance.go`) | No |
| Few resources in one service (e.g., EC2) | Resource file or service flex file (e.g., `internal/service/ec2/flex.go`) | No |
| Widely used in one service (e.g., EC2) | Service flex file (e.g., `internal/service/ec2/flex.go`) | No |
| Two services (e.g., EC2 and EKS) | Define a copy in each service | If helpful |
| 3+ services | `internal/flex/flex.go` | Yes |
=== "Terraform Plugin Framework (Preferred)" ```go func expandStructure(tfList []structureData) *service.Structure { if len(tfList) == 0 { return nil }
tfObj := tfList[0]
apiObject := &service.Structure{}
// ... nested attribute handling ...
return apiObject
}
func expandStructures(tfList []structureData) []*service.Structure {
if len(tfList) == 0 {
return nil
}
var apiObjects []*service.Structure
for _, tfObj := range tfList {
apiObject := &service.Structure{}
// ... nested attribute handling ...
apiObjects = append(apiObjects, apiObject)
}
return apiObjects
}
```
=== "Terraform Plugin SDK V2" ```go func expandStructure(tfMap map[string]any) *service.Structure { if tfMap == nil { return nil }
apiObject := &service.Structure{}
// ... nested attribute handling ...
return apiObject
}
func expandStructures(tfList []any) []service.Structure {
if len(tfList) == 0 {
return nil
}
var apiObjects []service.Structure
for _, tfMapRaw := range tfList {
tfMap, ok := tfMapRaw.(map[string]any)
if !ok {
continue
}
apiObject := expandStructure(tfMap)
if apiObject == nil {
continue
}
apiObjects = append(apiObjects, *apiObject)
}
return apiObjects
}
```
=== "Terraform Plugin Framework (Preferred)" ```go func flattenStructure(ctx context.Context, apiObject *service.Structure) (types.List, diag.Diagnostics) { var diags diag.Diagnostics elemType := types.ObjectType{AttrTypes: structureAttrTypes}
if apiObject == nil {
return types.ListNull(elemType), diags
}
obj := map[string]attr.Value{
// ... nested attribute handling ...
}
objVal, d := types.ObjectValue(structureAttrTypes, obj)
diags.Append(d...)
listVal, d := types.ListValue(elemType, []attr.Value{objVal})
diags.Append(d...)
return listVal, diags
}
func flattenStructures(ctx context.Context, apiObjects []*service.Structure) (types.List, diag.Diagnostics) {
var diags diag.Diagnostics
elemType := types.ObjectType{AttrTypes: structureAttrTypes}
if len(apiObjects) == 0 {
return types.ListNull(elemType), diags
}
elems := []attr.Value{}
for _, apiObject := range apiObjects {
if apiObject == nil {
continue
}
obj := map[string]attr.Value{
// ... nested attribute handling ...
}
objVal, d := types.ObjectValue(structureAttrTypes, obj)
diags.Append(d...)
elems = append(elems, objVal)
}
listVal, d := types.ListValue(elemType, elems)
diags.Append(d...)
return listVal, diags
}
```
=== "Terraform Plugin SDK V2" ```go func flattenStructure(apiObject *service.Structure) map[string]any { if apiObject == nil { return nil }
tfMap := map[string]any{}
// ... nested attribute handling ...
return tfMap
}
func flattenStructures(apiObjects []service.Structure) []any {
if len(apiObjects) == 0 {
return nil
}
var tfList []any
for _, apiObject := range apiObjects {
tfList = append(tfList, flattenStructure(&apiObject))
}
return tfList
}
```
=== "Terraform Plugin Framework (Preferred)" To read, if always sending the attribute value is correct:
```go
input := service.ExampleOperationInput{
    AttributeName: plan.AttributeName.ValueBoolPointer(),
}
```
Alternatively, if only sending the attribute value when `true`:
```go
input := service.ExampleOperationInput{}

if v := plan.AttributeName.ValueBool(); v {
    input.AttributeName = aws.Bool(v)
}
```
Or, if only sending the attribute value when it is known and not null:
```go
input := service.ExampleOperationInput{}

if !plan.AttributeName.IsUnknown() && !plan.AttributeName.IsNull() {
    input.AttributeName = plan.AttributeName.ValueBoolPointer()
}
```
To write:
```go
plan.AttributeName = flex.BoolToFramework(output.Thing.AttributeName)
```
=== "Terraform Plugin SDK V2" To read, if always sending the attribute value is correct:
```go
input := service.ExampleOperationInput{
AttributeName: aws.String(d.Get("attribute_name").(bool))
}
```
Otherwise to read, if only sending the attribute value when `true` is preferred (`!ok` for opposite):
```go
input := service.ExampleOperationInput{}

if v, ok := d.GetOk("attribute_name"); ok {
    input.AttributeName = aws.Bool(v.(bool))
}
```
To write:
```go
d.Set("attribute_name", output.Thing.AttributeName)
```
=== "Terraform Plugin Framework (Preferred)" To read:
```go
input := service.ExampleOperationInput{}

if !plan.AttributeName.IsNull() {
    input.AttributeName = plan.AttributeName.ValueFloat64Pointer()
}
```
To write:
```go
plan.AttributeName = flex.Float64ToFramework(output.Thing.AttributeName)
```
=== "Terraform Plugin SDK V2" To read:
```go
input := service.ExampleOperationInput{}

if v, ok := d.GetOk("attribute_name"); ok {
    input.AttributeName = aws.Float64(v.(float64))
}
```
To write:
```go
d.Set("attribute_name", output.Thing.AttributeName)
```
=== "Terraform Plugin Framework (Preferred)" To read:
```go
input := service.ExampleOperationInput{}

if !plan.AttributeName.IsNull() {
    input.AttributeName = plan.AttributeName.ValueInt64Pointer()
}
```
To write:
```go
plan.AttributeName = flex.Int64ToFramework(output.Thing.AttributeName)
```
=== "Terraform Plugin SDK V2" To read:
```go
input := service.ExampleOperationInput{}

if v, ok := d.GetOk("attribute_name"); ok {
    input.AttributeName = aws.Int64(int64(v.(int)))
}
```
To write:
```go
d.Set("attribute_name", output.Thing.AttributeName)
```
=== "Terraform Plugin Framework (Preferred)" To read:
```go
input := service.ExampleOperationInput{}

if !plan.AttributeName.IsNull() {
    var tfList []attributeNameData
    resp.Diagnostics.Append(plan.AttributeName.ElementsAs(ctx, &tfList, false)...)
    if resp.Diagnostics.HasError() {
        return
    }

    input.AttributeName = expandStructures(tfList)
}
```
To write:
```go
attributeName, d := flattenStructures(ctx, output.Thing.AttributeName)
resp.Diagnostics.Append(d...)
state.AttributeName = attributeName
```
=== "Terraform Plugin SDK V2" To read:
```go
input := service.ExampleOperationInput{}

if v, ok := d.GetOk("attribute_name"); ok && len(v.([]any)) > 0 {
    input.AttributeName = expandStructures(v.([]any))
}
```
To write:
```go
if err := d.Set("attribute_name", flattenStructures(output.Thing.AttributeName)); err != nil {
return sdkdiag.AppendErrorf(diags, "setting attribute_name: %s", err)
}
```
=== "Terraform Plugin Framework (Preferred)" To read:
```go
input := service.ExampleOperationInput{}

if !plan.AttributeName.IsNull() {
    var tfList []attributeNameData
    resp.Diagnostics.Append(plan.AttributeName.ElementsAs(ctx, &tfList, false)...)
    if resp.Diagnostics.HasError() {
        return
    }

    // expander handles translating a list with 1 item to a single AWS object
    input.AttributeName = expandStructure(tfList)
}
```
To write:
```go
// flattener handles nil output, returning the equivalent null Terraform type
attributeName, d := flattenStructure(ctx, output.Thing.AttributeName)
resp.Diagnostics.Append(d...)
state.AttributeName = attributeName
```
=== "Terraform Plugin SDK V2" To read:
```go
input := service.ExampleOperationInput{}

if v, ok := d.GetOk("attribute_name"); ok && len(v.([]any)) > 0 && v.([]any)[0] != nil {
    input.AttributeName = expandStructure(v.([]any)[0].(map[string]any))
}
```
To write:
```go
if output.Thing.AttributeName != nil {
    if err := d.Set("attribute_name", []any{flattenStructure(output.Thing.AttributeName)}); err != nil {
        return sdkdiag.AppendErrorf(diags, "setting attribute_name: %s", err)
    }
} else {
    d.Set("attribute_name", nil)
}
```
=== "Terraform Plugin Framework (Preferred)" To read:
```go
input := service.ExampleOperationInput{}

if !plan.AttributeName.IsNull() {
    input.AttributeName = flex.ExpandFrameworkStringValueList(ctx, plan.AttributeName)
}
```
To write:
```go
plan.AttributeName = flex.FlattenFrameworkStringValueList(output.Thing.AttributeName)
```
=== "Terraform Plugin SDK V2" To read:
```go
input := service.ExampleOperationInput{}

if v, ok := d.GetOk("attribute_name"); ok && len(v.([]any)) > 0 {
    input.AttributeName = flex.ExpandStringValueList(v.([]any))
}
```
To write:
```go
d.Set("attribute_name", output.Thing.AttributeName)
```
=== "Terraform Plugin Framework (Preferred)" To read:
```go
input := service.ExampleOperationInput{}

if !plan.AttributeName.IsNull() {
    input.AttributeName = flex.ExpandFrameworkStringValueMap(ctx, plan.AttributeName)
}
```
To write:
```go
plan.AttributeName = flex.FlattenFrameworkStringValueMap(output.Thing.AttributeName)
```
=== "Terraform Plugin SDK V2"

To read:
```go
input := service.ExampleOperationInput{}
if v, ok := d.GetOk("attribute_name"); ok && len(v.(map[string]any)) > 0 {
input.AttributeName = flex.ExpandStringValueMap(v.(map[string]any))
}
```
To write:
```go
d.Set("attribute_name", output.Thing.AttributeName)
```
=== "Terraform Plugin Framework (Preferred)"

To read:
```go
input := service.ExampleOperationInput{}
if !plan.AttributeName.IsNull() {
var tfList []attributeNameData
resp.Diagnostics.Append(plan.AttributeName.ElementsAs(ctx, &tfList, false)...)
if resp.Diagnostics.HasError() {
return
}
input.AttributeName = expandStructure(tfList)
}
```
To write:
```go
// flattener handles nil output, returning the equivalent null Terraform type
attributeName, d := flattenStructures(ctx, output.Thing.AttributeName)
resp.Diagnostics.Append(d...)
state.AttributeName = attributeName
```
=== "Terraform Plugin SDK V2"

To read:
```go
input := service.ExampleOperationInput{}
if v, ok := d.GetOk("attribute_name"); ok && v.(*schema.Set).Len() > 0 {
input.AttributeName = expandStructures(v.(*schema.Set).List())
}
```
To write:
```go
if err := d.Set("attribute_name", flattenStructures(output.Thing.AttributeNames)); err != nil {
return sdkdiag.AppendErrorf(diags, "setting attribute_name: %s", err)
}
```
=== "Terraform Plugin Framework (Preferred)"

To read:
```go
input := service.ExampleOperationInput{}
if !plan.AttributeName.IsNull() {
input.AttributeName = flex.ExpandFrameworkStringValueSet(ctx, plan.AttributeName)
}
```
To write:
```go
plan.AttributeName = flex.FlattenFrameworkStringValueSet(ctx, output.Thing.AttributeName)
```
=== "Terraform Plugin SDK V2"

To read:
```go
input := service.ExampleOperationInput{}
if v, ok := d.GetOk("attribute_name"); ok && v.(*schema.Set).Len() > 0 {
input.AttributeName = flex.ExpandStringValueSet(v.(*schema.Set))
}
```
To write:
```go
d.Set("attribute_name", output.Thing.AttributeName)
```
=== "Terraform Plugin Framework (Preferred)"

To read:
```go
input := service.ExampleOperationInput{}
if !plan.AttributeName.IsNull() {
input.AttributeName = plan.AttributeName.ValueStringPointer()
}
```
To write:
```go
plan.AttributeName = flex.StringToFramework(ctx, output.Thing.AttributeName)
```
=== "Terraform Plugin SDK V2"

To read:
```go
input := service.ExampleOperationInput{}
if v, ok := d.GetOk("attribute_name"); ok {
input.AttributeName = aws.String(v.(string))
}
```
To write:
```go
d.Set("attribute_name", output.Thing.AttributeName)
```
=== "Terraform Plugin Framework (Preferred)"

To ensure that parsing the read string value does not fail, use the RFC3339 timetype.
To read:
```go
input := service.ExampleOperationInput{}
if !plan.AttributeName.IsNull() {
attributeName, d := plan.AttributeName.ValueRFC3339Time()
resp.Diagnostics.Append(d...)
input.AttributeName = aws.Time(attributeName)
}
```
To write:
```go
plan.AttributeName = timetypes.NewRFC3339ValueMust(aws.ToTime(output.Thing.AttributeName).Format(time.RFC3339))
```
=== "Terraform Plugin SDK V2"
To ensure that parsing the read string value does not fail, define `attribute_name`'s `schema.Schema` with an appropriate `ValidateFunc`:
```go
"attribute_name": {
Type: schema.TypeString,
// ...
ValidateFunc: validation.IsRFC3339Time,
},
```
To read:
```go
input := service.ExampleOperationInput{}
if v, ok := d.GetOk("attribute_name"); ok {
v, _ := time.Parse(time.RFC3339, v.(string))
input.AttributeName = aws.Time(v)
}
```
To write:
```go
if output.Thing.AttributeName != nil {
d.Set("attribute_name", aws.ToTime(output.Thing.AttributeName).Format(time.RFC3339))
} else {
d.Set("attribute_name", nil)
}
```
=== "Terraform Plugin Framework (Preferred)"

To read, if always sending the attribute value is correct:
```go
func expandStructure(tfList []structureData) *service.Structure {
// ...
apiObject.NestedAttributeName = tfObj.NestedAttributeName.ValueBoolPointer()
// ...
}
```
To read, if only sending the attribute value when known and not nil:
```go
func expandStructure(tfList []structureData) *service.Structure {
// ...
if !tfObj.NestedAttributeName.IsUnknown() && !tfObj.NestedAttributeName.IsNull() {
apiObject.NestedAttributeName = tfObj.NestedAttributeName.ValueBoolPointer()
}
// ...
}
```
To write:
```go
func flattenStructure(ctx context.Context, apiObject *service.Structure) (types.List, diag.Diagnostics) {
// ...
// flex will handle setting null when appropriate
obj["nested_attribute_name"] = flex.BoolToFramework(ctx, apiObject.NestedAttributeName)
// ...
}
```
=== "Terraform Plugin SDK V2"

To read, if always sending the attribute value is correct:
```go
func expandStructure(tfMap map[string]any) *service.Structure {
// ...
if v, ok := tfMap["nested_attribute_name"].(bool); ok {
apiObject.NestedAttributeName = aws.Bool(v)
}
// ...
}
```
To read, if only sending the attribute value when `true` is preferred (`!v` for opposite):
```go
func expandStructure(tfMap map[string]any) *service.Structure {
// ...
if v, ok := tfMap["nested_attribute_name"].(bool); ok && v {
apiObject.NestedAttributeName = aws.Bool(v)
}
// ...
}
```
To write:
```go
func flattenStructure(apiObject *service.Structure) map[string]any {
// ...
if v := apiObject.NestedAttributeName; v != nil {
tfMap["nested_attribute_name"] = aws.ToBool(v)
}
// ...
}
```
=== "Terraform Plugin Framework (Preferred)"

To read:
```go
func expandStructure(tfList []structureData) *service.Structure {
// ...
if !tfObj.NestedAttributeName.IsUnknown() && !tfObj.NestedAttributeName.IsNull() {
apiObject.NestedAttributeName = tfObj.NestedAttributeName.ValueFloat64Pointer()
}
// ...
}
```
To write:
```go
func flattenStructure(ctx context.Context, apiObject *service.Structure) (types.List, diag.Diagnostics) {
// ...
// flex will handle setting null when appropriate
obj["nested_attribute_name"] = flex.Float64ToFramework(ctx, apiObject.NestedAttributeName)
// ...
}
```
=== "Terraform Plugin SDK V2"

To read:
```go
func expandStructure(tfMap map[string]any) *service.Structure {
// ...
if v, ok := tfMap["nested_attribute_name"].(float64); ok && v != 0.0 {
apiObject.NestedAttributeName = aws.Float64(v)
}
// ...
}
```
To write:
```go
func flattenStructure(apiObject *service.Structure) map[string]any {
// ...
if v := apiObject.NestedAttributeName; v != nil {
tfMap["nested_attribute_name"] = aws.ToFloat64(v)
}
// ...
}
```
=== "Terraform Plugin Framework (Preferred)"

To read:
```go
func expandStructure(tfList []structureData) *service.Structure {
// ...
if !tfObj.NestedAttributeName.IsUnknown() && !tfObj.NestedAttributeName.IsNull() {
apiObject.NestedAttributeName = tfObj.NestedAttributeName.ValueInt64Pointer()
}
// ...
}
```
To write:
```go
func flattenStructure(ctx context.Context, apiObject *service.Structure) (types.List, diag.Diagnostics) {
// ...
// flex will handle setting null when appropriate
obj["nested_attribute_name"] = flex.Int64ToFramework(ctx, apiObject.NestedAttributeName)
// ...
}
```
=== "Terraform Plugin SDK V2"

To read:
```go
func expandStructure(tfMap map[string]any) *service.Structure {
// ...
if v, ok := tfMap["nested_attribute_name"].(int); ok && v != 0 {
apiObject.NestedAttributeName = aws.Int64(int64(v))
}
// ...
}
```
To write:
```go
func flattenStructure(apiObject *service.Structure) map[string]any {
// ...
if v := apiObject.NestedAttributeName; v != nil {
tfMap["nested_attribute_name"] = aws.ToInt64(v)
}
// ...
}
```
=== "Terraform Plugin Framework (Preferred)"

To read:
```go
func expandStructure(ctx context.Context, tfList []structureData) (*service.Structure, diag.Diagnostics) {
// ...
var nested []nestedAttributeNameData
diags.Append(tfObj.NestedAttributeName.ElementsAs(ctx, &nested, false)...)
// expand will handle null when appropriate
apiObject.NestedAttributeName = expandNestedAttributeName(nested)
// ...
}
```
To write:
```go
func flattenStructure(ctx context.Context, apiObject *service.Structure) (types.List, diag.Diagnostics) {
// ...
// flatten will handle setting null when appropriate
obj["nested_attribute_name"] = flattenNestedAttributeName(ctx, v)
// ...
}
```
=== "Terraform Plugin SDK V2"

To read:
```go
func expandStructure(tfMap map[string]any) *service.Structure {
// ...
if v, ok := tfMap["nested_attribute_name"].([]any); ok && len(v) > 0 {
apiObject.NestedAttributeName = expandStructures(v)
}
// ...
}
```
To write:
```go
func flattenStructure(apiObject *service.Structure) map[string]any {
// ...
if v := apiObject.NestedAttributeName; v != nil {
tfMap["nested_attribute_name"] = flattenNestedStructures(v)
}
// ...
}
```
=== "Terraform Plugin Framework (Preferred)"

To read:
```go
func expandStructure(ctx context.Context, tfList []structureData) (*service.Structure, diag.Diagnostics) {
// ...
var nested []nestedAttributeNameData
diags.Append(tfObj.NestedAttributeName.ElementsAs(ctx, &nested, false)...)
// expand will handle null when appropriate
apiObject.NestedAttributeName = expandNestedAttributeName(nested)
// ...
}
```
To write:
```go
func flattenStructure(ctx context.Context, apiObject *service.Structure) (types.List, diag.Diagnostics) {
// ...
// flatten will handle setting null when appropriate
obj["nested_attribute_name"] = flattenNestedAttributeName(ctx, v)
// ...
}
```
=== "Terraform Plugin SDK V2"

To read:
```go
func expandStructure(tfMap map[string]any) *service.Structure {
// ...
if v, ok := tfMap["nested_attribute_name"].([]any); ok && len(v) > 0 && v[0] != nil {
apiObject.NestedAttributeName = expandStructure(v[0].(map[string]any))
}
// ...
}
```
To write:
```go
func flattenStructure(apiObject *service.Structure) map[string]any {
// ...
if v := apiObject.NestedAttributeName; v != nil {
tfMap["nested_attribute_name"] = []any{flattenNestedStructure(v)}
}
// ...
}
```
=== "Terraform Plugin Framework (Preferred)"

To read:
```go
func expandStructure(ctx context.Context, tfList []structureData) (*service.Structure, diag.Diagnostics) {
// ...
if !tfObj.NestedAttributeName.IsUnknown() && !tfObj.NestedAttributeName.IsNull() {
apiObject.NestedAttributeName = flex.ExpandFrameworkStringList(ctx, tfObj.NestedAttributeName)
}
// ...
}
```
To write:
```go
func flattenStructure(ctx context.Context, apiObject *service.Structure) (types.List, diag.Diagnostics) {
// ...
// flex will handle setting null when appropriate
obj["nested_attribute_name"] = flex.FlattenFrameworkStringList(ctx, v)
// ...
}
```
=== "Terraform Plugin SDK V2"

To read:
```go
func expandStructure(tfMap map[string]any) *service.Structure {
// ...
if v, ok := tfMap["nested_attribute_name"].([]any); ok && len(v) > 0 {
apiObject.NestedAttributeName = flex.ExpandStringValueList(v)
}
// ...
}
```
To write:
```go
func flattenStructure(apiObject *service.Structure) map[string]any {
// ...
if v := apiObject.NestedAttributeName; v != nil {
tfMap["nested_attribute_name"] = v
}
// ...
}
```
=== "Terraform Plugin Framework (Preferred)"

To read:
```go
func expandStructure(ctx context.Context, tfList []structureData) (*service.Structure, diag.Diagnostics) {
// ...
if !tfObj.NestedAttributeName.IsUnknown() && !tfObj.NestedAttributeName.IsNull() {
apiObject.NestedAttributeName = flex.ExpandFrameworkStringMap(ctx, tfObj.NestedAttributeName)
}
// ...
}
```
To write:
```go
func flattenStructure(ctx context.Context, apiObject *service.Structure) (types.List, diag.Diagnostics) {
// ...
// flex will handle setting null when appropriate
obj["nested_attribute_name"] = flex.FlattenFrameworkStringMap(ctx, v)
// ...
}
```
=== "Terraform Plugin SDK V2"

To read:
```go
func expandStructure(tfMap map[string]any) *service.Structure {
	// ...
	if v, ok := tfMap["nested_attribute_name"].(map[string]any); ok && len(v) > 0 {
		apiObject.NestedAttributeName = flex.ExpandStringValueMap(v)
	}
	// ...
}
```
To write:
```go
func flattenStructure(apiObject *service.Structure) map[string]any {
// ...
if v := apiObject.NestedAttributeName; v != nil {
tfMap["nested_attribute_name"] = v
}
// ...
}
```
=== "Terraform Plugin Framework (Preferred)"

To read:
```go
func expandStructure(ctx context.Context, tfList []structureData) (*service.Structure, diag.Diagnostics) {
// ...
var nested []nestedAttributeNameData
diags.Append(tfObj.NestedAttributeName.ElementsAs(ctx, &nested, false)...)
// expand will handle null when appropriate
apiObject.NestedAttributeName = expandNestedAttributeName(nested)
// ...
}
```
To write:
```go
func flattenStructure(ctx context.Context, apiObject *service.Structure) (types.List, diag.Diagnostics) {
// ...
// flatten will handle setting null when appropriate
obj["nested_attribute_name"] = flattenNestedAttributeName(ctx, v)
// ...
}
```
=== "Terraform Plugin SDK V2"

To read:
```go
func expandStructure(tfMap map[string]any) *service.Structure {
// ...
if v, ok := tfMap["nested_attribute_name"].(*schema.Set); ok && v.Len() > 0 {
apiObject.NestedAttributeName = expandStructures(v.List())
}
// ...
}
```
To write:
```go
func flattenStructure(apiObject *service.Structure) map[string]any {
// ...
if v := apiObject.NestedAttributeName; v != nil {
tfMap["nested_attribute_name"] = flattenNestedStructures(v)
}
// ...
}
```
=== "Terraform Plugin Framework (Preferred)"

To read:
```go
func expandStructure(ctx context.Context, tfList []structureData) (*service.Structure, diag.Diagnostics) {
// ...
if !tfObj.NestedAttributeName.IsUnknown() && !tfObj.NestedAttributeName.IsNull() {
apiObject.NestedAttributeName = flex.ExpandFrameworkStringSet(ctx, tfObj.NestedAttributeName)
}
// ...
}
```
To write:
```go
func flattenStructure(ctx context.Context, apiObject *service.Structure) (types.List, diag.Diagnostics) {
// ...
// flex will handle setting null when appropriate
obj["nested_attribute_name"] = flex.FlattenFrameworkStringSet(ctx, v)
// ...
}
```
=== "Terraform Plugin SDK V2"

To read:
```go
func expandStructure(tfMap map[string]any) *service.Structure {
// ...
if v, ok := tfMap["nested_attribute_name"].(*schema.Set); ok && v.Len() > 0 {
apiObject.NestedAttributeName = flex.ExpandStringValueSet(v)
}
// ...
}
```
To write:
```go
func flattenStructure(apiObject *service.Structure) map[string]any {
// ...
if v := apiObject.NestedAttributeName; v != nil {
tfMap["nested_attribute_name"] = v
}
// ...
}
```
=== "Terraform Plugin Framework (Preferred)"

To read:
```go
func expandStructure(tfList []structureData) *service.Structure {
// ...
if !tfObj.NestedAttributeName.IsUnknown() && !tfObj.NestedAttributeName.IsNull() {
apiObject.NestedAttributeName = tfObj.NestedAttributeName.ValueStringPointer()
}
// ...
}
```
To write:
```go
func flattenStructure(ctx context.Context, apiObject *service.Structure) (types.List, diag.Diagnostics) {
// ...
// flex will handle setting null when appropriate
obj["nested_attribute_name"] = flex.StringToFramework(ctx, apiObject.NestedAttributeName)
// ...
}
```
=== "Terraform Plugin SDK V2"

To read:
```go
func expandStructure(tfMap map[string]any) *service.Structure {
// ...
if v, ok := tfMap["nested_attribute_name"].(string); ok && v != "" {
apiObject.NestedAttributeName = aws.String(v)
}
// ...
}
```
To write:
```go
func flattenStructure(apiObject *service.Structure) map[string]any {
// ...
if v := apiObject.NestedAttributeName; v != nil {
tfMap["nested_attribute_name"] = aws.ToString(v)
}
// ...
}
```
=== "Terraform Plugin Framework (Preferred)"

To ensure that parsing the read string value does not fail, use the RFC3339 timetype.
To read:
```go
func expandStructure(tfList []structureData) (*service.Structure, diag.Diagnostics) {
// ...
if !tfObj.NestedAttributeName.IsUnknown() && !tfObj.NestedAttributeName.IsNull() {
nested, d := tfObj.NestedAttributeName.ValueRFC3339Time()
diags.Append(d...)
apiObject.NestedAttributeName = aws.Time(nested)
}
// ...
}
```
To write:
```go
func flattenStructure(ctx context.Context, apiObject *service.Structure) (types.List, diag.Diagnostics) {
// ...
obj["nested_attribute_name"] = timetypes.NewRFC3339ValueMust(aws.ToTime(apiObject.NestedAttributeName).Format(time.RFC3339))
// ...
}
```
=== "Terraform Plugin SDK V2"
To ensure that parsing the read string value does not fail, define `nested_attribute_name`'s `schema.Schema` with an appropriate `ValidateFunc`:
```go
"nested_attribute_name": {
Type: schema.TypeString,
// ...
ValidateFunc: validation.IsRFC3339Time,
},
```
To read:
```go
func expandStructure(tfMap map[string]any) *service.Structure {
// ...
if v, ok := tfMap["nested_attribute_name"].(string); ok && v != "" {
v, _ := time.Parse(time.RFC3339, v)
apiObject.NestedAttributeName = aws.Time(v)
}
// ...
}
```
To write:
```go
func flattenStructure(apiObject *service.Structure) map[string]any {
// ...
if v := apiObject.NestedAttributeName; v != nil {
tfMap["nested_attribute_name"] = aws.ToTime(v).Format(time.RFC3339)
}
// ...
}
```
This section includes additional topics related to data design and decision making from the Terraform AWS Provider maintainers.
Certain resources may need to interact with binary (non-UTF-8) data, while the Terraform State only supports UTF-8 data. Configurations attempting to pass binary data to an attribute will receive an error from Terraform CLI. These attributes should expect and store the value as a Base64 string, performing any necessary encoding or decoding in the resource logic.
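As a minimal sketch of this pattern (the helper names here are hypothetical, not existing provider functions), the resource logic might decode the Base64 attribute value before sending it to the AWS API and re-encode the raw response bytes before saving them to the Terraform State:

```go
package main

import (
	"encoding/base64"
	"fmt"
)

// expandBinaryAttribute decodes the Base64 string stored in the Terraform
// State into the raw bytes expected by the AWS API.
func expandBinaryAttribute(tfValue string) ([]byte, error) {
	return base64.StdEncoding.DecodeString(tfValue)
}

// flattenBinaryAttribute encodes raw bytes from an AWS API response as a
// Base64 string, which is safe to store in the UTF-8-only Terraform State.
func flattenBinaryAttribute(apiValue []byte) string {
	return base64.StdEncoding.EncodeToString(apiValue)
}

func main() {
	raw := []byte{0x00, 0x10, 0xff} // not valid UTF-8
	encoded := flattenBinaryAttribute(raw)
	fmt.Println(encoded)

	decoded, err := expandBinaryAttribute(encoded)
	fmt.Println(decoded, err)
}
```

Documenting that the attribute expects Base64 content lets operators use Terraform's built-in `filebase64()` or `base64encode()` functions in their configurations.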
During resource destroy operations, only previously applied Terraform State values are available to resource logic. Even if the configuration is updated in a manner where both the resource destroy is triggered (e.g., setting the resource meta-argument `count = 0`) and an attribute value is updated, the resource logic will only have the previously applied data values.
Any usage of attribute values during destroy should explicitly note in the resource documentation that the desired value must be applied into the Terraform State before any apply to destroy the resource.
Attribute values may be very lengthy or potentially contain Sensitive Values. A potential solution might be to use a hashing algorithm, such as MD5 or SHA256, to convert the value before saving in the Terraform State to reduce its relative size or attempt to obfuscate the value. However, there are a few reasons not to do so:
- Operators reviewing plans would see only opaque differences such as `abc123 -> def456`, with no way to determine what actually changed.

Any value hashing implementation will not be accepted. An exception to this guidance is if the remote system explicitly provides a separate hash value in responses, in which case a resource can provide a separate attribute with that hashed value.
Marking an Attribute in the Terraform Plugin Framework Schema with Sensitive has the following real-world implications:
- Terraform CLI will hide the value in plan difference output.
- In Terraform CLI 0.14 (with the `provider_sensitive_attrs` experiment enabled) and later, any downstream references to the value in other configurations will also hide the value in plan difference output.

The value is either always hidden or not, as the Terraform Plugin Framework does not currently implement conditional support for this functionality. Since Terraform Configurations have no control over the behavior, hiding values from the plan difference can incur a potentially undesirable user experience cost for operators.
Given that, and especially with the improvements in Terraform CLI 0.14, the Terraform AWS Provider maintainers' guiding principle is that an Attribute should only be marked Sensitive if its value could be used to gain access to the remote system or otherwise constitutes secret material.

If you are unsatisfied with sensitive value handling, the maintainers recommend ensuring there is a covering issue in the Terraform CLI and/or Terraform Plugin Framework projects explaining the use case. Ultimately, Terraform Plugins, including the Terraform AWS Provider, cannot implement their own sensitive value abilities if the upstream projects do not implement the appropriate functionality.
Attributes which only exist within Terraform and not the remote system are typically referred to as virtual attributes. Especially in the case of Destroy State Values, these attributes rely on the Implicit State Passthrough behavior of values in Terraform to be available in resource logic. A fictitious example of one of these may be a resource attribute such as a `skip_waiting` flag, which is used only in the resource logic to skip the typical behavior of waiting for operations to complete.
If a virtual attribute has a default value that does not match the Zero Value Mapping for its type, it is recommended to explicitly set the default value during resource import (e.g., via `d.Set()` in the Plugin SDK's `schema.Resource` Importer `State` function, or `resp.State.SetAttribute()` in the Framework's `ImportState` method), for example:
=== "Terraform Plugin Framework (Preferred)"

<!-- markdownlint-disable no-space-in-emphasis -->
```go
func (r *ThingResource) ImportState(ctx context.Context, req resource.ImportStateRequest, resp *resource.ImportStateResponse) {
	// ... Other import activity

	resp.Diagnostics.Append(resp.State.SetAttribute(ctx, path.Root("skip_waiting"), true)...)
}
```
<!-- markdownlint-enable no-space-in-emphasis -->
=== "Terraform Plugin SDK V2"

```go
&schema.Resource{
	// ... other fields ...
	Importer: &schema.ResourceImporter{
		State: func(d *schema.ResourceData, meta any) ([]*schema.ResourceData, error) {
			d.Set("skip_waiting", true)

			return []*schema.ResourceData{d}, nil
		},
	},
}
```
This helps prevent an immediate plan difference after resource import unless the configuration has a non-default value.
When using the Terraform Plugin SDK V2, the recommendation is to use methods on the [`ResourceData`](https://pkg.go.dev/github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema#ResourceData) structure to access root attributes. In certain scenarios, access to the Terraform config, plan, or state may be required; use the `GetRawConfig`, `GetRawPlan`, and `GetRawState` methods, which return `cty` values.
The `ToFramework` function can be used to convert a `cty` object value into a Terraform Plugin Framework model. For example:

```go
import (
	tfcty "github.com/hashicorp/terraform-provider-aws/internal/cty"
	...
)

type scheduleModel struct {
	RefreshType types.String `tfsdk:"refresh_type"`
	...
}

var data scheduleModel

err := tfcty.ToFramework(ctx, d.GetRawConfig(), &data)
```
Below is a listing of relevant terms and descriptions for data handling and conversion in the Terraform AWS Provider to establish common conventions throughout this documentation. This list is not exhaustive of all concepts of Terraform Plugins, the Terraform AWS Provider, or the data handling that occurs during Terraform runs, but these should generally provide enough context about the topics discussed here.
AWS Service API Models use specific terminology to describe data and types:
The Terraform Language uses the following terminology to describe data and types:
Terraform Plugin Framework Schemas use the following terminology to describe data and types:
Terraform Plugin SDK Schemas use the following terminology to describe data and types:
Some other terms that may be used:
Fractional numbers (zero value `0.0`) and whole numbers (zero value `0`) are not implemented natively in the Terraform Plugin SDK; they use `TypeString` (where `""` represents not configured) with additional validation.

For additional reference, the Terraform documentation also includes a full glossary of terminology.