dataMine2 APIs and Remote Access

@smartloft - I suspect what’s happening here is that you haven’t increased the size of the array formula in Excel for the number of results you want. The array formula is in cells A1:B31, which means it can only display a maximum of 31 data samples; any more than that will be ignored.

To display more results simply increase the size of the range, although if you are not familiar with array (or Control-Shift-Enter) formulas it might feel a bit tricky. For 100 results, click on cell A1 then select cells A1:B100, then click on the formula in the formula bar and then hit Control-Shift-Enter.

Also, your graph is using the default aggregations, Aggregation Period 0 and Aggregation Type 0, so if this is what you want in Excel you should choose these values there too.

Let us know how you get on…

@ConstantSphere - thanks, will give it a shot. Thanks also for the excellent work on dataMine2.

[quote=“ConstantSphere, post:21, topic:190679”]@smartloft - I suspect what’s happening here is that you haven’t increased the size of the array formula in Excel for the number of results you want.[/quote]

Hi guys,

I have been working on the API and have now managed to get it to work. At the moment I have a series of HTTPS links which allow me to access lu_sdata remotely.
However, the current process is a very manual one which requires multiple steps.

In short, can you suggest the best programming language or software to use to automate this process? Thanks.

@carbonarchitect - can you give us a bit more background on what your set-up is and what you are trying to achieve? It will help with trying to make a recommendation. Thanks.

@ConstantSphere - I have three locations which each have a VeraEdge with sensors, TRVs, smart plugs, etc. I would like to use the lu_sdata data request to collect variable values remotely so that I can store them in one location, the desktop in my house, and analyse them in Excel.
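
For illustration, the local-network version of that idea is only a few lines in any scripting language. Here is a minimal Python sketch assuming the standard lu_sdata data request on port 3480; the IP address, output file and the handful of fields picked out are placeholders, so check your own lu_sdata output for what your devices actually report.

```python
# Sketch: pull lu_sdata from one Vera on the LAN and append the current
# variable values to a CSV file. VERA_IP and OUT_FILE are placeholders;
# the fields picked out below depend on the device types you have.
import csv
import time
import requests

VERA_IP = "192.168.1.10"       # placeholder: your Vera's LAN address
OUT_FILE = "vera_samples.csv"  # placeholder: where to accumulate samples

def fetch_sdata(ip):
    url = "http://{}:3480/data_request".format(ip)
    resp = requests.get(url, params={"id": "lu_sdata"}, timeout=10)
    resp.raise_for_status()
    return resp.json()

def append_samples(sdata, out_file):
    stamp = int(time.time())
    with open(out_file, "a", newline="") as f:
        writer = csv.writer(f)
        for dev in sdata.get("devices", []):
            # Only some device types expose these fields; skip the rest.
            for field in ("temperature", "humidity", "watts", "level"):
                if field in dev:
                    writer.writerow([stamp, dev.get("name"), field, dev[field]])

if __name__ == "__main__":
    append_samples(fetch_sdata(VERA_IP), OUT_FILE)
```

Over the internet the same request has to go via the MiOS relay servers, which is what the authentication steps discussed next are about.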

For the remote access I am using the MiOS phone app API which requires multiple steps of authentication before you can request any data from the box. This process of authentication is the same each time and I want to wrap it up into one package/program which will execute the process automatically.

All the steps are HTTPS requests, and I wanted to know if you had any opinions on which language I should use to run this set of requests automatically?
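
Any language that can issue HTTPS requests will handle this; as one possibility, here is a rough Python sketch of scripting a fixed chain of requests with the requests library. Every URL, parameter and header name in it is a placeholder to be swapped for the exact requests from the manual process, since those MiOS endpoints aren’t spelled out in this thread; only the overall shape (one session, each step feeding its tokens into the next) is the point.

```python
# Sketch of automating a fixed chain of HTTPS requests with Python's
# requests library. Every URL, parameter and header name below is a
# placeholder: substitute the exact requests you currently issue by hand
# for the MiOS authentication steps and the final lu_sdata call. Do not
# run it as-is; fill the placeholders in first.
import requests

AUTH_URL = "https://AUTH-SERVER-PLACEHOLDER/auth-step-1"            # placeholder
DATA_URL = "https://RELAY-SERVER-PLACEHOLDER/port_3480/data_request"  # placeholder

def fetch_remote_sdata(username, password_hash):
    with requests.Session() as s:
        # Step 1: authenticate and capture whatever tokens come back.
        auth = s.get(AUTH_URL,
                     params={"user": username, "pass": password_hash},  # placeholder params
                     timeout=15)
        auth.raise_for_status()
        tokens = auth.json()

        # Steps 2..n: any intermediate session / relay-server lookups go here,
        # passing the tokens from the previous step as headers or parameters.

        # Final step: the lu_sdata request against the remote box.
        data = s.get(DATA_URL, params={"id": "lu_sdata"},
                     headers={"X-Session-Token": tokens.get("token", "")},  # placeholder header
                     timeout=15)
        data.raise_for_status()
        return data.json()
```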

@carbonarchitect - thanks. Are the 3 remote boxes on UI5 or UI7? And do you need to have all the data in your central location or would it be OK to request the remote data as and when you need it?

@ConstantSphere - Our Veras are running UI7 at the moment. Since I messaged you I have succeeded in accessing the datamine log data. Thanks for your help on that.

As far as collecting the data goes, my aim is to collect the data at least once a day from each of the dataMine channels to build up the trends on my desktop without visiting the locations. I want to be able to set the times when the data is collected and otherwise have it running automatically.
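
For the once-a-day part, a desktop scheduler such as cron or Windows Task Scheduler is the usual route, but here is a rough sketch in pure Python; collect_all is only a stand-in for whatever actually fetches and stores the data from the three boxes.

```python
# Rough sketch: run a collection pass once a day at a fixed local time.
# collect_all() is a placeholder; replace its body with the real requests.
import datetime
import time

RUN_AT = datetime.time(hour=2, minute=0)   # e.g. 02:00 local time

def collect_all():
    print("collecting data from each Vera...")   # placeholder body

def seconds_until(run_at):
    now = datetime.datetime.now()
    next_run = datetime.datetime.combine(now.date(), run_at)
    if next_run <= now:
        next_run += datetime.timedelta(days=1)
    return (next_run - now).total_seconds()

while True:
    time.sleep(seconds_until(RUN_AT))
    collect_all()
```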

@carbonarchitect - going back to your original question, I’m not really in a position to recommend a programming language or environment for you as that totally depends on what you are familiar with. However, if you ultimately want the data in Excel you might want to try modifying the VBA code in the Excel spreadsheet I attached above.

I’m quite interested to know the steps you took to get the data remotely using UI7 - I understood the process is quite complex.

@ConstantSphere

How does one remove ghost graphs after the sensors have been removed?

Much thanks

@smartloft - to remove a ghost variable from an old sensor, simply click on the Configuration tab and click the Ghost Variables pane on the bottom left. Find the ghost variable you want to remove and click the red X to delete.

OK, so does DataMine 2 actually work on a Vera 3 (UI5) or is it only ported for UI7?

I can’t find this answer anywhere on MCV.

It should do. It’s written and tested on my Vera Lite running UI5. I don’t remember exactly which platform it was, but someone mentioned that the USB mounting feature behaved oddly; otherwise it worked OK. Let us know how you get on.

I will want to use it with an external mount. I suppose that means setting up a CIFS mount on my Vera 3 first before I can use DM2 this way?
Are there any known problems with DM2 and CIFS that you can explain, please?

There should be no problem setting up a CIFS mount for dataMine2 to use. It is covered in the user guide available to download from here: http://forum.micasaverde.com/index.php/topic,35724.0.html.

Hi @ConstantSphere,

Recently I have noticed that when you request data via the API, dataMine will give you readings at your start and stop times regardless of whether there is actually a data point there. This means there are often duplicate data points at the beginning and end of every data call.

Do you know why this happens and is it something that can be fixed?

Many thanks
carbonarchitect

Hi carbonarchitect,

Yes. It’s actually by design and intended to mimic the behaviour of the original version of dataMine.

Whether it’s desirable or not is another question and one that I wrestled with for some time. On the plus side it means that when displaying non-aggregated data you get a constant amount of data on the screen and can scroll through it evenly. As dataMine2 only records changes in data there could be long regions with no raw data in them, and something needs to be displayed.

On the minus side, aggregated data can have additional data points. There are actually a lot of edge cases, and when I worked through all the potential scenarios I decided on balance that the points should be there.

If you ensure you call the API with start and end times that are exact multiples of your aggregation period, the extra points won’t be there.
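
For example, the alignment can be done before the call, with times as Unix epoch seconds and the period as the aggregation period in seconds; this is just an illustrative snippet, not part of dataMine2 itself.

```python
# Snap a requested time range outwards to exact multiples of the
# aggregation period, so the extra boundary points described above
# don't appear. All values are in seconds.
def align_range(start, stop, period):
    aligned_start = (start // period) * period                 # round down to a boundary
    aligned_stop = ((stop + period - 1) // period) * period    # round up to a boundary
    return aligned_start, aligned_stop

# Example: roughly one day of hourly aggregated data (period = 3600 s).
print(align_range(1672531523, 1672617000, 3600))
```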

Are you working with aggregations or raw data? What’s your specific use case?