JSON processing in Go provides robust tools for handling data serialization and deserialization. Let me share my experience with eight essential techniques that have proven invaluable in my projects.
Custom marshaling lets a type control its own JSON representation by implementing the MarshalJSON and UnmarshalJSON methods. I frequently use this for date formatting and custom string representations:
type CustomTime time.Time

// MarshalJSON renders the time as a date-only string.
func (t CustomTime) MarshalJSON() ([]byte, error) {
    timestamp := time.Time(t).Format("2006-01-02")
    return json.Marshal(timestamp)
}

// UnmarshalJSON parses the date-only string back into a CustomTime.
func (t *CustomTime) UnmarshalJSON(data []byte) error {
    var timestamp string
    if err := json.Unmarshal(data, &timestamp); err != nil {
        return err
    }
    parsed, err := time.Parse("2006-01-02", timestamp)
    if err != nil {
        return err
    }
    *t = CustomTime(parsed)
    return nil
}
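As a quick sanity check, here is a minimal usage sketch; the Document type and its fields are invented for illustration:

type Document struct {
    Title   string     `json:"title"`
    Created CustomTime `json:"created"`
}

func exampleDocument() {
    doc := Document{Title: "report", Created: CustomTime(time.Now())}
    out, _ := json.Marshal(doc)
    fmt.Println(string(out)) // e.g. {"title":"report","created":"2024-05-01"}
}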
When working with large JSON files, streaming becomes crucial for memory efficiency. I implement this with json.Decoder, which reads one value at a time instead of loading the whole payload into memory. The loop below handles a stream of concatenated JSON objects; a variant for a single large array follows the snippet:
func processLargeJSON(reader io.Reader) error {
    decoder := json.NewDecoder(reader)
    for decoder.More() {
        var item map[string]interface{}
        if err := decoder.Decode(&item); err != nil {
            return fmt.Errorf("decode error: %w", err)
        }
        processItem(item)
    }
    return nil
}
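When the input is a single large array rather than a stream of objects, the surrounding brackets can be consumed with Token() and the elements decoded one at a time. A sketch of that variant:

func processLargeArray(reader io.Reader) error {
    decoder := json.NewDecoder(reader)
    // Read the opening '[' token.
    if _, err := decoder.Token(); err != nil {
        return fmt.Errorf("read open bracket: %w", err)
    }
    for decoder.More() {
        var item map[string]interface{}
        if err := decoder.Decode(&item); err != nil {
            return fmt.Errorf("decode element: %w", err)
        }
        processItem(item)
    }
    // Read the closing ']' token.
    if _, err := decoder.Token(); err != nil {
        return fmt.Errorf("read close bracket: %w", err)
    }
    return nil
}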
Field omission helps control JSON output. I use struct tags to exclude sensitive data or empty fields:
type User struct {
    ID        int        `json:"id"`
    Email     string     `json:"email"`
    Password  string     `json:"-"`                    // never serialized
    LastLogin *time.Time `json:"last_login,omitempty"` // pointer, so a missing login is actually omitted
    Settings                                           // embedded struct: its fields are inlined into the output
}
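To make the effect concrete, here is a sketch of what this marshals to, assuming a hypothetical Settings struct with a single Theme field:

type Settings struct {
    Theme string `json:"theme"`
}

func exampleUser() {
    u := User{ID: 1, Email: "a@example.com", Password: "secret", Settings: Settings{Theme: "dark"}}
    out, _ := json.Marshal(u)
    fmt.Println(string(out))
    // {"id":1,"email":"a@example.com","theme":"dark"}
    // Password is dropped entirely and the nil LastLogin is omitted.
}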
Raw message handling proves valuable when dealing with dynamic JSON structures:
type Event struct {
    Type    string          `json:"type"`
    Payload json.RawMessage `json:"payload"` // decoded later, once the type is known
}
func processEvent(data []byte) error {
    var event Event
    if err := json.Unmarshal(data, &event); err != nil {
        return err
    }
    switch event.Type {
    case "user":
        var user User
        if err := json.Unmarshal(event.Payload, &user); err != nil {
            return err
        }
        handleUser(user)
    case "order":
        var order Order
        if err := json.Unmarshal(event.Payload, &order); err != nil {
            return err
        }
        handleOrder(order)
    default:
        return fmt.Errorf("unknown event type %q", event.Type)
    }
    return nil
}
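The same RawMessage type works on the producing side: marshal the inner payload first, then wrap it in the envelope. A minimal sketch, reusing the User type from earlier:

func newUserEvent(u User) ([]byte, error) {
    payload, err := json.Marshal(u)
    if err != nil {
        return nil, err
    }
    return json.Marshal(Event{Type: "user", Payload: payload})
}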
Number precision handling becomes critical in financial applications. By default, encoding/json decodes every JSON number into a float64, which can silently lose precision on large integers and exact decimal amounts; UseNumber keeps the original textual value so you decide how to convert it:
func handleNumericData(data string) (float64, error) {
    dec := json.NewDecoder(strings.NewReader(data))
    dec.UseNumber() // keep numbers as json.Number instead of float64
    var v interface{}
    if err := dec.Decode(&v); err != nil {
        return 0, err
    }
    if num, ok := v.(json.Number); ok {
        // Converting back to float64 here is only for demonstration; for exact
        // values, keep the json.Number (or parse it with a decimal library).
        return num.Float64()
    }
    return 0, fmt.Errorf("not a number")
}
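When the schema is known up front, I find it simpler to declare the field as json.Number directly, so the exact textual value survives decoding. A small sketch with an invented Invoice type:

type Invoice struct {
    ID     string      `json:"id"`
    Amount json.Number `json:"amount"` // exact text, e.g. "19.99"
}

func decodeInvoice(data []byte) (Invoice, error) {
    var inv Invoice
    err := json.Unmarshal(data, &inv)
    return inv, err
}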
Map type conversion often requires careful handling of numeric types:
func convertMapNumbers(input map[string]interface{}) map[string]interface{} {
    result := make(map[string]interface{})
    for k, v := range input {
        switch value := v.(type) {
        case json.Number:
            if f, err := value.Float64(); err == nil {
                result[k] = f
            } else {
                result[k] = value // keep the original value rather than silently dropping the key
            }
        case map[string]interface{}:
            result[k] = convertMapNumbers(value) // recurse into nested objects
        default:
            result[k] = v
        }
    }
    return result
}
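This conversion only matters when the decoder was set up with UseNumber; otherwise every number is already a float64. A usage sketch tying the two together:

func decodeAndConvert(data []byte) (map[string]interface{}, error) {
    dec := json.NewDecoder(bytes.NewReader(data))
    dec.UseNumber()
    var raw map[string]interface{}
    if err := dec.Decode(&raw); err != nil {
        return nil, err
    }
    return convertMapNumbers(raw), nil
}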
Validation during unmarshaling ensures data integrity:
type ValidatedUser struct {
    Email string `json:"email"`
    Age   int    `json:"age"`
}

func (u *ValidatedUser) UnmarshalJSON(data []byte) error {
    // Alias has the same fields but none of the methods, which keeps this
    // UnmarshalJSON from calling itself recursively.
    type Alias ValidatedUser
    aux := &struct{ *Alias }{Alias: (*Alias)(u)}
    if err := json.Unmarshal(data, aux); err != nil {
        return err
    }
    if !strings.Contains(u.Email, "@") {
        return fmt.Errorf("invalid email format")
    }
    if u.Age < 0 || u.Age > 150 {
        return fmt.Errorf("invalid age")
    }
    return nil
}
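Because the checks live inside UnmarshalJSON, any code path that decodes a ValidatedUser gets them for free:

func exampleValidation() {
    var u ValidatedUser
    err := json.Unmarshal([]byte(`{"email":"not-an-email","age":30}`), &u)
    fmt.Println(err) // invalid email format
}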
Performance optimization through object pooling reduces memory allocation:
var bufferPool = sync.Pool{
    New: func() interface{} {
        return new(bytes.Buffer)
    },
}

func FastJSONEncode(v interface{}) ([]byte, error) {
    buf := bufferPool.Get().(*bytes.Buffer)
    defer func() {
        buf.Reset()
        bufferPool.Put(buf)
    }()
    encoder := json.NewEncoder(buf)
    encoder.SetEscapeHTML(false)
    if err := encoder.Encode(v); err != nil {
        return nil, err
    }
    // Copy the bytes out before the buffer goes back to the pool; returning
    // buf.Bytes() directly would hand the caller memory that is about to be reused.
    // Note: Encoder.Encode appends a trailing newline to the output.
    out := make([]byte, buf.Len())
    copy(out, buf.Bytes())
    return out, nil
}
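A usage sketch; whether the pool actually pays off depends on payload sizes and call frequency, so it is worth benchmarking against plain json.Marshal for your workload:

func exampleEncode() {
    data, err := FastJSONEncode(map[string]string{"status": "ok"})
    if err != nil {
        log.Fatal(err)
    }
    fmt.Print(string(data)) // {"status":"ok"} plus the trailing newline Encode adds
}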
These techniques form a comprehensive toolkit for JSON processing in Go. I’ve found them particularly useful in building scalable applications that handle various data formats and sizes. The key is choosing the right technique based on your specific requirements around performance, memory usage, and data validation needs.
Remember to handle errors appropriately and test edge cases thoroughly when implementing these patterns. JSON processing can be tricky, especially when dealing with user-provided data or integrating with external systems.
The Go standard library’s encoding/json package provides excellent performance for most use cases. However, for extreme performance requirements, consider third-party packages that offer additional optimizations through code generation or alternative parsing strategies.