Channel: Gurock Software Support Forum - General Discussion
Viewing all 685 articles

Feature Request: Activity streams in TestSuites


Hello,
Like many teams, we use test suites to do our initial planning. We create tests in test suites, review them with the team, and as time goes on our test suite (which typically maps to some feature) gains more and more coverage.

It would be very helpful to have the activity tab in the test suites view. This way managers and leads can see whether we are actively developing/planning for a feature.


Feature Request - configurable activity stream duration


Hello,
Currently, the activity stream in a test run defaults to 14 days. While this is somewhat helpful, it would be ideal if we had some control over this. Our single sprint is one month long, and we only get half the picture because of this limitation.

I am mostly interested in seeing the graph, which (hopefully) shouldn't be too much of a performance hit if the duration is longer.

The actual details of the activity can be paginated to avoid the performance hit.

Thanks

Ali

Feature Request - Clickable pie chart in milestone view


Hello,
A milestone can be composed of many test plans. Similarly, a test plan can be composed of many test suites. It is very helpful to view the pie chart at both the milestone and test plan level.

As a manager, when I see that there are 10 blockers in a particular milestone, I would like to view them as quickly as possible so that I can work on unblocking these issues.

It would make sense to be able to click on the blocked slice of the pie chart and have TestRail show me all the blocked tests. Currently, the 10 blockers in a milestone are spread across various test plans, and I have to go into each test plan and then into some test suite within it to finally view the blocked test case.

It would be great if this could be part of the next release.

Losing test result edits...


(I thought this was reported already, but I can't find the entry - so sorry if this is a repeat).

We've been using TestRail now for about 6 months and really like it in general.

However, I've had several instances where folks are losing their test results information.

It's not because of a bug - it's more a usability issue.

What happens is a tester will open a Comment or Test Result entry and start putting their results in as they are testing. Most of the time, they will periodically click "Add Test Results", which saves their current edits and closes the popup. They'll then either edit that same entry to continue, or create a new entry.

Sometimes, though, some time will pass before they save, or they will forget that there's no autosave and click away from the page.

In either case, they lose their edits.

It would be really helpful to have either an autosave, or some other easy way to save the existing edits and continue on. It would certainly save me from hearing the, uh... colorful language I heard this afternoon when one of my guys lost his entire day's worth of work.

Editing Permissions


Is it possible to edit/add to the list of permissions available? 

In particular, I would like to restrict our testers' ability to move test cases, but unchecking the Add/Edit Test Cases & Sections permission is too restrictive for what we need.

Neil

Custom Field for Test Results alignment


TestRail Help says:

Depending on the custom field type, custom fields are either placed on the left (for large fields) or on the right (for small fields) of the result dialog.

However, it would appear that this is defined by the "Type" of the custom field, as I have a Multi-select custom field in my Test Results that I want to appear on the right, but it is appearing on the left despite each of the selectable items being short.

Is this correct?  If so, could you list which types are deemed "large fields" and which are "small fields"?

Regards,
Neil

Feature Request: View user activity


It would be great to be able to view user activity like
- activities in test suites (tests added, modified, etc.) over some time span
- activities in test runs/test plans (pass, fail, blocked results posted) over some time span

Thanks

Ali

bug: Export as Excel exports as CSV


API: Post error


I'm trying to update a case status via a TestRail API HTTP POST request (using Fiddler2).

Request:

POST /testrail/index.php?/miniapi/add_result/1&key=123456&status_id=5 HTTP/1.1
Accept: application/json
Content-Length: 0
Content-Type: application/x-www-form-urlencoded
Host: localhost

JSON Result:

error=One of Status ID, Assigned To or Comment is required
result=False

Any ideas what I'm doing wrong?

Thanks!
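A likely cause, judging from the Content-Length: 0 header above: the mini API reads the result fields from the form-encoded POST body, and this request puts status_id in the query string, where only the key belongs. This is a sketch under that assumption, reusing the host, key and test ID from the request above; `build_add_result_request` is a hypothetical helper, not part of TestRail:

```python
from urllib.parse import urlencode

def build_add_result_request(base, test_id, key, **fields):
    """Build (url, body) for a miniapi add_result call.

    The API key stays in the query string; the result fields
    (status_id, comment, assignedto, ...) go into the
    form-encoded POST body instead.
    """
    url = f"{base}/index.php?/miniapi/add_result/{test_id}&key={key}"
    body = urlencode(fields)  # becomes e.g. "status_id=5"
    return url, body

url, body = build_add_result_request(
    "http://localhost/testrail", 1, "123456", status_id=5)
# POST `body` to `url` with
# Content-Type: application/x-www-form-urlencoded
```

With the field in the body, Content-Length is no longer 0 and the "One of Status ID, Assigned To or Comment is required" error should go away.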

Test Plans setup with multiple Test Suites


In TestRail, we have set up a project that has multiple Test Suites. We currently have our Test Plans set up to run two test suites (one titled "basic" and one titled "custom"). Using this setup, for each Test Plan, we selected a number of sections under the "basic" Test Suite and a number of sections under the "custom" Test Suite. Each Test Plan doesn't require ALL tests to be run from the Test Suite – we selected only the Test Cases/sections that are relevant to that Test Plan.

However, we have run into an issue when maintaining these Test Plans.

Question: When we add a new Test Case to the Test Suite "basic", the new Test Case does not show up in the Test Plan. Is there a way to have new Test Cases appear in the Test Plan without selecting all the Tests from the entire Test Suite?

Feature Request: Clone TestPlan Suites


Hello,
I have a TestPlan called 'Login Feature TestPlan'. I also have a testsuite called Login TestSuite which has all my login tests.

When I create the TestPlan, I add a TestSuite to the plan and rename the suite to 'FireFox - Tests'. I then add the same testsuite again, this time renaming it to 'IE 8 - Tests'. I repeat this for multiple browsers.

The issue with this is that each time I have to figure out the set of tests by applying filters or manually selecting the tests. This becomes tedious and somewhat error prone.

Once I have figured out the set of tests for one suite in the test plan, I would like to be able to clone the same selection and create another suite.

The configuration option doesn't quite work for us due to API limitation and some other issues already discussed with the TestRail team.

Thanks

elapsed field datatype - API question


Hello,
In the POST for add_result or add_result_for_case, we can pass in elapsed. What is the datatype of this field? I am trying to pass the time/duration from our automated JUnit reports to TestRail so that we can start using TestRail's forecasting feature.

The time field in JUnit reports is a double. Example:

<testcase classname="msipackage.test_splunkinterop.TestSplunkInterop" name="test_interop_64bitForwarderTo32bitSplunk" time="1885.76499987"/>

Can I pass the time to elapsed as is?

Ali
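For reference, TestRail's elapsed value is a timespan string (e.g. "30s" or "1m 45s") rather than a raw number, so the JUnit double can't be passed as-is. A minimal conversion sketch, assuming rounding to whole seconds is acceptable; `junit_time_to_elapsed` is a made-up helper name:

```python
def junit_time_to_elapsed(seconds):
    """Convert a JUnit `time` value (float seconds, possibly as a
    string) to a TestRail-style elapsed string such as "31m 26s".

    Sub-second times are rounded up to "1s" so the duration is
    never empty or zero.
    """
    total = max(1, round(float(seconds)))
    minutes, secs = divmod(total, 60)
    if minutes:
        return f"{minutes}m {secs}s" if secs else f"{minutes}m"
    return f"{secs}s"

junit_time_to_elapsed("1885.76499987")  # -> "31m 26s"
```

The result string can then be sent as the elapsed field of add_result / add_result_for_case.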

close_run API method is unknown?


Hi.

I'm trying to close test runs with the close_run API call. But I keep getting this:

{u'result': False, u'error': u"Unknown method 'close_run'"}

The TestRail version is v2.7.0.1978, and as I understand it this method was added in 2.6...

Any advice?

Thanks.

Plugin "Redmine" returned an error. Invalid HTTP code (403).


Hi guys,

For one particular Project in TestRail, I'm getting the following error message whenever I try to push a defect to Redmine:

Plugin "Redmine" returned an error: Invalid HTTP code (403). Please check your user/password and that the API is enabled in Redmine.

This is happening whether I use the global TestRail integration setup for Redmine, or if I configure the integration specifically for this Project.

All other Projects, irrespective of them using either a Project-specific Redmine integration or the global one, work as expected, and I'm able to push a defect directly to Redmine.  It's just this one Project that has the issue.  The only thing we can think of is that this Project was created before the Redmine plugin/integration was configured.

Any thoughts?

Neil
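One way to narrow this down is to hit Redmine's REST API directly with the same credentials TestRail uses; a 403 outside TestRail would point at the Redmine account or the "Enable REST web service" setting rather than at the plugin. A sketch with placeholder host and credentials; `check_redmine_api` is a hypothetical helper, not part of either product:

```python
import base64
import urllib.error
import urllib.request

def basic_auth_header(user, password):
    """Build the HTTP Basic Authorization header value."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Basic {token}"

def check_redmine_api(base_url, user, password):
    """GET /users/current.json from Redmine.

    A 200 means the credentials work and the REST API is enabled;
    a 403 reproduces the error TestRail reports.
    """
    req = urllib.request.Request(f"{base_url}/users/current.json")
    req.add_header("Authorization", basic_auth_header(user, password))
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

# Example (placeholder values):
# check_redmine_api("https://redmine.example.com", "testrail", "secret")
```

If this returns 200 outside TestRail but the plugin still fails for only one Project, the Project-level integration settings would be the next thing to compare against a working Project.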

Feature request: Cross-configuration status quick view


We often have scenarios where we have multiple runs occurring in a plan through different configuration options.  Often if something has failed in Firefox, we need to investigate other browsers as well, which are all segregated by configuration options.  Similarly, if something has failed in one environment, we need to compare it to another environment (again, which may be a config option).

From a test result, it would be useful to provide a quick view of the status of the same test in other configurations or runs.

Thanks


3 Feature Requests: Default Milestones; Suites; Import Test Runs


1. When creating Milestones, there should be a way to set one milestone as the "default" milestone. Then, when entering Test Plans and Test Runs, it would select the "default" milestone automatically.

2. When entering Test Runs in a project where there is only one Test Suite, it should not ask you to select the Test Suite, since there is only one to choose from. It should skip that step and select the single Test Suite by default.

3. You should be able to import Test Runs: give it a Test Run Name, Milestone, Description, Assigned Name and Test Suite(s). The same XML syntax used for importing Cases would work really well. This would be a huge time saver.

Thank you

Getting an error from miniapi add_result


I'm attempting to post test results using the miniapi add_result method, but I'm getting the following error:

{
    "result": false,
    "error": "[Microsoft][SQL Server Native Client 11.0][SQL Server]Conversion failed when converting the nvarchar value 'api' to data type int."
}

The post looks like this:

http://sea1testrail01/testrail/index.php?/miniapi/add_result/1258&key=lsdkf90234kjlkjas999893824

Post parameters: status_id=1

I've successfully gotten results back from the miniapi for other GET and POST calls, so I'm not sure what to make of this.

Any ideas as to how to fix this?

Feature Request: adding existing testplans to milestones


Hello,
We have quite a few test plans in TestRail. As we move on to new milestones, we have to add a good number of these testplans to the new milestone.

Currently, I have to go to each test plan and then add it to the new milestone. Doing this for 30+ test plans every few weeks is a lot of work.

Is it possible to add a new feature for adding existing test plans to a given milestone? The 'edit' view of a milestone could show all the available test plans, and we could simply check off the ones we want to add to the milestone.

Needless to say, if this can make it into the next release, we will be very happy.

How to prioritize the testsuites in a testrun?


Hi All,
I was wondering: how can I set up multiple test suites in a test run and prioritize them?

For example, my TestRun has TestSuite A, TestSuite B and TestSuite C.
How do I assign them to my testers so they run TestSuite C, TestSuite B and then TestSuite A, in that order?

New to this Software so any help would be greatly appreciated.

Thank You

Search on Testcases that belong to Milestones


Is there a way to identify a list of testcases that are assigned to a specific Milestone?
