Improving Edgesense

Calling @melancon. Can I use UBx expertise to make improvements on Edgesense? Since we are considering building it into a full-fledged semantic tool, we might as well also improve its “basic” version, the one that sits on the website and updates in real time. Let me know; if so, we’ll schedule a call for this.

Yeah!

I had missed this one, thanks for bringing me in – through email … :frowning:

Yes, I want to work on this, and I am happy you brought it up. I need input from users to guide the design of the dashboard. You know my methodology:

  • talk to me about your work, what you do and what you are looking for;
  • we work together to see how visualization, interactive manipulation and/or computation can support your tasks;
  • I go and build something, combining ideas from the literature with new designs.

We iterate over this, refining the domain questions (that we can support) and the specifications of the dashboard (visual encodings, interactions).

@Alberto How should we organize it?

@Noemi (?) (who else?) All potential users and interested persons are invited to jump into a participatory brainstorm.


A dedicated call is in order

I have an initial idea. We could go through it in a call, and see what and when is feasible.

The big choice is: what is dashboardy (more standardized, automatically updating, more suited to “watching ourselves collaborate”, as Guy says in the video) vs. what is custom (analytically sharp, batch computation, more suited to deep scientific analysis).

Friday 29th OK?


Dang.

I guess that call is over. :frowning: @melancon

If you’d like to sniff down my preferred alley a little (I will usually give in pretty quickly when you raise cost/benefit, don’t worry), I’d love to throw my ideas at you. If you happen to understand German, you can also listen to some of the discussions/monologues I’ve had around the issue. I think I agree very much with Alberto: perhaps the largest tension is in the perceived urgency to include “non-text material” (audio, images, video). If you’d like to discuss, hit me up. Everyone else is invited too, of course.

I’ll be travelling a lot until Friday though, so I propose watching this video in the meantime; it is probably something like the third piece of the puzzle: https://youtu.be/pHjeFFGug1Y?t=3126 . It won’t hurt to jump back to the beginning (a couple of times, if you don’t know Cynefin yet). I just set the link to start closer to a point where it may also pay off a little for Op3nCare.

Dang?

Not sure what this means – if you mean you are sorry you missed the call, then I am as danged as you are :slight_smile:

No German here. Try my Italian :slight_smile: I’ll have a look at the video and post back.

The call

This is a summary of the decisions made in the call: https://edgeryders.eu/en/opencare-research/developing-the-on3ncare-conversation-dashboard-notes

Uhm. I am not a fan of Cognitive Edge. Anyway, I’ll watch.