Redux: Question: Normalizr usage with nested objects in React/Redux

Created on 20 Jan 2016 · 15 comments · Source: reduxjs/redux

My data models look like this:

- user
- message has sender: user and recipient: user
- conversation has messages: arrayOf(message) and lastMessage: message
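
Expressed as normalizr schemas, that would be roughly something like this (a sketch using the Schema/arrayOf API; the entity key names are just illustrative):

import { Schema, arrayOf } from 'normalizr'

// Each Schema is keyed by its collection name in state.entities
const user = new Schema('users')
const message = new Schema('messages')
const conversation = new Schema('conversations')

// Relationships; normalize() replaces these with IDs
message.define({
  sender: user,
  recipient: user
})

conversation.define({
  messages: arrayOf(message),
  lastMessage: message
})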

And the UI structure is _roughly_ this:

<Messenger>
  <ConversationList conversations={conversations}>
    {conversations.map(conversation => (
      <ConversationItem conversation={conversation}>
        <Message message={conversation.lastMessage}>
          <UserAvatar user={conversation.lastMessage.sender} /> {conversation.lastMessage.body}
        </Message>
      </ConversationItem>
    ))}
  </ConversationList>
  <Conversation conversation={selectedConversation}>
    {selectedConversation.messages.map(message => (
      <Message message={message}>
        <UserAvatar user={message.sender} /> {message.body}
      </Message>
    ))}
  </Conversation>
</Messenger>

(tl;dr: a conversation list where each item shows its last message, plus the selected conversation showing its list of messages. Each component is in a separate file.)

When fetching, all entities go to the entities reducer as expected, and relationships turn into IDs.
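
For example, after fetching one conversation the entities slice might look roughly like this (IDs made up for illustration):

// Example shape of state.entities (illustrative data)
const entities = {
  users: {
    u1: { id: 'u1', name: 'Alice', avatarUrl: '/avatars/u1.png' },
    u2: { id: 'u2', name: 'Bob', avatarUrl: '/avatars/u2.png' }
  },
  messages: {
    m1: { id: 'm1', body: 'Hi!', sender: 'u1', recipient: 'u2' }
  },
  conversations: {
    c1: { id: 'c1', messages: ['m1'], lastMessage: 'm1' }
  }
}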

I want a component to re-render if any of its nested objects has been updated, so if some user's avatar is updated, a <UserAvatar> inside a <Message> anywhere should update.

Question:

Where does it make sense to denormalize the data (using @connect) while minimizing unnecessary updates?
I can do it in each component individually, but essentially that would make all my components @connected.

import React from 'react'
import { connect } from 'react-redux'

@connect(
  (state, ownProps) => {
    return {
      // ownProps.user can be an ID; in that case, look the user up in entities
      user: typeof ownProps.user === 'string' ? state.entities.user[ownProps.user] : ownProps.user
    }
  }
)
export default class UserAvatar extends React.Component {
  render() {
    return <img src={this.props.user.avatarUrl} />
  }
}

Is this a good approach? Would it scale when there are many updates flowing through?

question

All 15 comments

Not specific to Redux but here's the approach I recommend.

Let every component receive _the data it uses_ as props, but retrieve _the data only children need_ by using connect(). This is a very nice tradeoff and I've found it pleasant to work with.
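
For example, a rough sketch of that pattern (entity keys and state shape are illustrative): the Message component receives the message it renders as a prop, and uses connect() only to look up the sender its UserAvatar child needs.

import React from 'react'
import { connect } from 'react-redux'
import UserAvatar from './UserAvatar'

// Message gets the data it renders itself (the message) as a prop,
// and connects only for what its child needs (the sender entity).
const Message = ({ message, sender }) => (
  <div>
    <UserAvatar user={sender} /> {message.body}
  </div>
)

export default connect(
  (state, ownProps) => ({
    sender: state.entities.users[ownProps.message.sender]
  })
)(Message)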

This is also a duplicate of #815.

For what it's worth: I'm prototyping my pieces for my first React/Redux app right now, and I have a use case where the opposite seems to work well. In particular, I've got multiple types of items that I need to represent in a treeview, that can be nested as: A>B, A>C>D>B, A>E>B. A is basically just a containing "folder" in the tree, B/C/D/E are actual entity types. My initial implementation has each parent rendering tree nodes for the children and passing in IDs, and each child has an appropriate mapStateToProps to pull in the entity it cares about by ID, plus some related data. Since a given normalized entity already has just the IDs of its children, and individual nodes need to update separately (say, drawing a highlight if it's the currently selected tree node), this seems to be working out well for me.
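
A rough sketch of that child-connects-by-ID pattern (component names and state shape here are just illustrative):

import React from 'react'
import { connect } from 'react-redux'

// The parent passes only an ID; the child looks up its own entity and
// any related UI state (e.g. whether it is the currently selected node).
class TreeNode extends React.Component {
  render() {
    const { node, isSelected } = this.props
    return (
      <li className={isSelected ? 'selected' : ''}>
        {node.name}
        <ul>
          {node.childIds.map(id => <ConnectedTreeNode key={id} nodeId={id} />)}
        </ul>
      </li>
    )
  }
}

const ConnectedTreeNode = connect(
  (state, ownProps) => ({
    node: state.entities.nodes[ownProps.nodeId],
    isSelected: state.ui.selectedNodeId === ownProps.nodeId
  })
)(TreeNode)

export default ConnectedTreeNode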

Again, still a newbie and still just prototyping, but tossing this out there as a counter-example.

The problem with passing just an ID is that it's hard to render arbitrary data. In some cases it can be useful to render a "draft" or "fake" state instead of what's currently in the store under the given ID. If you want to render empty states in a list of items, you can't, because the parent component can't reliably know whether the data for the corresponding IDs has been fetched. It's also harder to test components that require IDs. Finally, it's harder to create something like a "visual palette" of components, because instead of passing static fake props you have to prefill the store.
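
For instance, a contrived sketch of the draft case (Message here is assumed to take a full message object as a prop):

import React from 'react'
import Message from './Message' // takes a full message object as a prop

// A draft that exists only locally and isn't in the store yet.
const draft = {
  body: 'Hello (not sent yet)',
  sender: { name: 'Me', avatarUrl: '/avatars/me.png' }
}

// Message can render the draft because it accepts whatever object it is given;
// a component that only accepts an ID has nothing to look up in the store.
const Composer = () => <Message message={draft} />

export default Composer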

So not saying it's inherently bad but there are downsides to always passing IDs.

Thanks.

I also don't like the IDs as properties.
However, making sure each parent denormalizes the data for its children is also a lot of work.

Before integrating Redux, my models just had defineProperty getters for their relationships (I use js-data for models), so it was easy.

Maybe normalizr should have a denormalize method that uses property getters and reads from some collection (e.g. the Redux state from the entities reducer), so it doesn't run into infinite loops while denormalizing, doesn't read entities unless they're requested (lazily), and keeps everything flat and immutable in the store.

I'll sketch something.

I don't think preemptive denormalization is a good approach. You'll likely end up recreating many objects that haven't really changed, thus killing performance optimizations at the component level. Creating objects gradually as components need them is likely to be more performant because you'll utilize existing performance optimizations at the component level and won't recreate more objects than necessary.
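
For illustration, compare two mapStateToProps functions (a sketch, assuming the entities shape discussed above):

// Eager: builds a fresh denormalized object tree on every call, so connect()'s
// shallow props comparison (and shouldComponentUpdate) always sees new props.
const mapStateToPropsEager = (state, ownProps) => {
  const c = state.entities.conversations[ownProps.conversationId]
  return {
    conversation: { ...c, lastMessage: state.entities.messages[c.lastMessage] }
  }
}

// Gradual: passes the stored entity as-is; the prop stays referentially equal
// until that specific entity is actually replaced in the store.
const mapStateToPropsGradual = (state, ownProps) => ({
  conversation: state.entities.conversations[ownProps.conversationId]
})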

@gaearon

It can be something like:

let conversations = state.messenger.activeConversations
  .map(c => state.entities.conversations[c])
  .map(c => denormalize(c, schema.conversation, state.entities))

const denormalize = (entity, schema, bag) => {
  const instance = { ...entity }

  for (let relation of /* schema relations */) {
    let relationId = entity[relation.key]
    if (relationId) {
      Object.defineProperty(instance, relation.key, {
        // TODO: memoize value, handle arrays, handle polymorphism etc.
        get() { return denormalize(bag[relation.type][relationId], relation, bag) }
      })
    } else {
      instance[relation.key] = null
    }
  }

  return instance
}

When denormalizing a conversation, only a single object is created, with property getters. Only when a property is accessed does it fetch the relationship lazily and denormalize it too. The returned object can be cached as long as the entity in the bag hasn't changed, so no extra denormalization happens each time a property is accessed.
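
The memoization could be as simple as keying a cache on the entity object itself (a rough sketch of the idea, assuming entities are replaced rather than mutated and each entity is denormalized with a single schema):

// Cache denormalized results per entity object. When a reducer replaces an
// entity, the new object misses the cache and gets denormalized fresh.
const cache = new WeakMap()

const cachedDenormalize = (entity, schema, bag) => {
  if (!cache.has(entity)) {
    cache.set(entity, denormalize(entity, schema, bag))
  }
  return cache.get(entity)
}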

What do you think?

If you can make it work efficiently please feel free to make an attempt (either inside or outside normalizr).

@gaearon
Initial commit, please take a look: https://github.com/elado/normalizr/commit/35c8acb770c473e886296c8a7f9c03f502a9c167
It needs to be inside normalizr because of the schema type references, which are not exposed. I'm also thinking about changing the way relationships are saved in the schema, because right now I need to skip all the keys that start with _ in order to get the relation names.

Before making a PR I'll try it in my own app to see performance and usage.

Yeah, this would be best with an example proving that shouldComponentUpdate() optimizations still work well and that large updates are efficient.

@gaearon, just took a look at your new tree example, and saw you'd chosen to implement it by passing IDs to the child components. I'm a bit surprised, given your comments here just a few days ago, which seemed to suggest you had reservations about that approach; maybe I'm over-interpreting what you said, though.

Still, nice example, and thanks for taking the time to put it together.

Haha yeah I was just about to check in again in this thread :-). So normally I'd use the approach I suggested but it's harder to make it performant when there are thousands of entities and arrays are involved. So it's a tradeoff of performance for convenience.

So far, with hundreds/thousands of entities my approach seems promising :)
No tradeoffs -- root container denormalizes root entities and all other components remain dumb.
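
Something along these lines (a sketch; the file and selector names are illustrative, and denormalize is the lazy helper sketched above):

import React from 'react'
import { connect } from 'react-redux'
import ConversationList from './ConversationList'
import { denormalize } from './denormalize' // the lazy helper sketched above
import * as schema from './schema'          // the normalizr schemas

// Only the root container is connected: it denormalizes the root entities,
// and every component below it receives plain objects as props.
const Messenger = ({ conversations }) => (
  <ConversationList conversations={conversations} />
)

export default connect(
  state => ({
    conversations: state.messenger.activeConversations
      .map(id => state.entities.conversations[id])
      .map(c => denormalize(c, schema.conversation, state.entities))
  })
)(Messenger)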

Any updates on this?

Why would there be any updates on this? What kind of updates do you expect?

Hi @gaearon, I was waiting for @elado's denormalize PR, but I have found denormalizr.

