12:00:47 #startmeeting Mer QA meeting 28/06/2012
12:00:47 Meeting started Thu Jun 28 12:00:47 2012 UTC. The chair is E-P. Information about MeetBot at http://wiki.merproject.org/wiki/Meetings.
12:00:47 Useful Commands: #action #agreed #help #info #idea #link #topic.
12:00:55 o/
12:00:59 #topic Current status
12:01:21 #info Tools and tests development process updated to wiki
12:01:29 #link https://wiki.merproject.org/wiki/Quality/Development
12:01:51 #info will start work with doing Mer releases/builds in a way that can support automated QA and replicable releases, finally
12:01:52 o/
12:02:54 I would like to move the tools development process somewhere else in the wiki, now it is under QA
12:03:16 I haven't yet created the Category:About pages for QA tools
12:04:32 lbt, phaeron1: any wishes where we should have the tools development description? as an own page or part of the current tools page?
12:04:54 part of Tools I think
12:05:04 I don't have anything else
12:06:09 my plan with releases is starting to utilize the copy project patch within Mer CI quite soon, and do snapshot builds quite actively
12:06:13 which we can run with test .kses
12:06:38 #info Release of OBS 2.3.1mer-1 is underway
12:06:53 great
12:07:05 so far eat-host,eat-device is a nice combo
12:07:09 and hasn't failed me in demos
12:07:36 and testrunner-ui working fine
12:07:58 nod
12:08:11 so QA is truly a selling point
12:08:32 i also use eat-device as a nice way of demonstrating it's easy to prototype on device
12:08:36 as you can scp/ssh with ease
12:08:47 I would like to start using eat-host and eat-device, testrunner with my just finished vm based testing
12:09:34 it's very simple for now but should allow for testrunner based arbitrary testing of i586 vms
12:09:41 phaeron1: do you have a ks file for creating a test automation vm image?
12:10:03 i've been missing smoke testing a lot with the recent release, so being able to run even the simplest of tests would be good
12:11:08 E-P: the vm to perform tests or the vm that will be tested :)
12:11:39 Stskeeps: we can have that as our main priority for now
12:11:57 even simple tests like "did this boot" would be good
12:11:58 phaeron1: vm that will be tested
12:12:05 as we're having new compiler, systemd coming in
12:12:31 E-P: basically any i586 ks with ssh and eat device added
12:12:43 phaeron1: + the eth0 setup
12:13:20 well right now I am using a big kernel that can do dhcp. so it is not HA (hardware adaptation) dependent
12:13:32 virtio-net + dhcp
12:13:39 I am sure this will change
12:13:39 ok
12:14:04 but it was much simpler to not use the image's kernel, as it might well fail to boot in a vm
12:14:07 not sure yet
12:15:36 anything else to add?
12:15:59 i've added a few task bugs for performance work as well, debugging, etc
12:16:03 will be in next triage
12:16:08 powertop, sp-* tools, etc
12:17:20 I would like comments on that HA thing
12:17:40 phaeron1: sounds good to me, tbh - if you can do it in a way that's equal to what is in mer kernel-headers..
12:17:47 then that's a winner in my book
12:18:41 ie, kernel sources
12:18:53 what is the reason for the kernel to do the dhcp, no possibility to config connman or something else?
12:19:11 Sage_: connman's notoriously difficult to do a static ip first, it seems
12:19:20 and the other side is that ping comes up quite quickly
12:20:07 wasn't there something called net console too?
12:20:11 yes
12:20:13 so we can have kernel messages from early on
12:20:27 ok, sounds good to me.
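A minimal sketch of the kind of i586 test-VM kickstart phaeron1 describes (ssh plus eat-device, with networking left to the VM kernel's built-in DHCP). The repo URL, partition layout, and exact package names are illustrative assumptions, not an agreed Mer .ks:

```
# Hypothetical i586 test-VM kickstart sketch -- repo URL and package
# names are placeholder assumptions, not an agreed Mer file.
lang en_US.UTF-8
timezone UTC
rootpw rootme
part / --size 2000 --ondisk sda --fstype=ext4
repo --name=mer-core --baseurl=http://example.com/mer/latest/i586/packages/

%packages
@Mer Core
openssh-server
eat-device
%end

%post
# eth0 is brought up by the VM kernel's built-in DHCP in this setup,
# so the image itself only needs sshd enabled for eat/testrunner access
systemctl enable sshd.service
%end
```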
12:20:30 Stskeeps: I use a recent 3.x kernel built on cobs, we can sync it with mer core of course, it needs various virtio options and the builtin dhcp thing
12:20:34 http://www.mjmwired.net/kernel/Documentation/networking/netconsole.txt
12:21:38 we also need a new way of receiving syslog messages, i think
12:21:39 that will be a useful tool for debugging
12:21:42 for testrunner*
12:21:55 nc
12:22:00 add it to tools
12:22:21 systemd journal is a new interesting player in this market
12:22:51 uh huh
12:25:51 ok, I think we can go on
12:26:44 #topic Test packaging
12:26:54 Let's start with the naming
12:27:15 in Meego the way was to use the -tests suffix in the packages
12:27:46 i think it would be good to have similar also in mer
12:27:55 yes, for test binaries/libraries
12:28:26 what does test binaries mean?
12:28:41 I don't have a problem with that but I think we need to take the mapping out of the "just use package names"
12:29:32 Sage_: executables that are on the device/host that do the actual testing
12:29:40 e.g. there is package connman-test which has test scripts for testing connman functionality.
12:29:57 should that be also called -tests?
12:30:07 yes
12:30:27 even if those have nothing to do with Mer QA actually, but is something released by upstream?
12:30:54 lbt: yes, like we planned some time ago, the mapping is more than just package names
12:31:03 Sage_: well. the only package is created by Mer packaging
12:31:16 they provide the tests, we decide how to package/use them
12:31:21 http://pastie.org/4165425
12:31:22 Sage_: yes, just having a consistent naming in packages that provide tests
12:31:28 E-P: just making sure :)
12:31:54 lbt: yes, but I just want to know if the -tests is for a certain kind of test files, or just any kind of tests written by anyone.
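The netconsole mechanism linked above is configured through a kernel boot parameter, so kernel messages can be captured from early boot. A sketch following the format in the linked Documentation/networking/netconsole.txt; the addresses, ports, interface, and MAC here are placeholder assumptions for a host/guest pair:

```
# Guest kernel command line: stream printk output over UDP from boot onwards.
# Format: netconsole=[src-port]@[src-ip]/[dev],[tgt-port]@<tgt-ip>/[tgt-mac]
netconsole=6665@192.168.1.2/eth0,6666@192.168.1.1/00:11:22:33:44:55

# On the receiving host, listen for the messages, e.g. with netcat:
#   nc -l -u 6666
```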
12:32:03 excellent question
12:32:28 what if we have upstream tests and our own tests in 2 source trees/packages
12:32:38 I would say that the -tests package can have any kind of tests which are approved by the mer qa team
12:32:52 connman-tests-*
12:32:52 +1 for E-P
12:33:22 what kind of tests do we have atm.?
12:33:37 build time tests are not packaged so we can rule those out.
12:33:46 (they will be)
12:34:03 lbt: that is a good question too
12:34:06 (we have a goal to split build-time tests out)
12:34:08 lbt: well in src package
12:34:34 lbt: we have, didn't know that.
12:34:38 should we have bluez-unit-tests and bluez-dbus-tests
12:34:53 Sage_: yes, it's a goal. The idea is to run the tests on devices, not build VMs
12:35:04 yep
12:35:55 E-P: I would prefer the PACKAGENAME-tests-* syntax
12:36:10 but that might be just me.
12:36:12 Sage_: we don't have many tests, mainly from meego and upstream
12:36:58 and you can have PACKAGENAME-tests[-*] so the -* parts are allowed but not mandatory. PACKAGENAME-tests is made from the package source
12:36:59 we have tests that are run by testrunner, right? These tests require testrunner to be used. Then we have test scripts by upstream that the user can call in a shell but which do not require testrunner or anything.
12:37:33 I don't think we should package all into one, as that would just mean that when a user wants shell scripts to test with, he would get extra dependencies that he doesn't need.
12:37:33 the -* varieties can come from the main src (if you want to split them for some reason)
12:38:50 Sage_: test packages do not require testrunner to be used, the idea was that the testrunner's definition is separated from the test package
12:39:36 a test package contains scripts/executables for testing some feature or function
12:39:54 would be cool if we had a common way to define an "entry point" for a test package
12:39:57 * lbt cautions we're spending a lot of time on this ... PACKAGENAME-tests[-*] seems like a good compromise to support all requirements?
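The PACKAGENAME-tests[-*] convention being converged on here is easy to check mechanically; a tiny shell sketch (the helper name is made up for illustration, not an existing Mer tool):

```shell
#!/bin/sh
# Hypothetical helper: does a package name follow the agreed
# PACKAGENAME-tests[-*] naming convention for test packages?
is_tests_package() {
    case "$1" in
        *-tests|*-tests-*) return 0 ;;  # e.g. connman-tests, bluez-tests-dbus
        *)                 return 1 ;;  # e.g. old-style connman-test
    esac
}

for p in connman-tests bluez-tests-dbus connman-test; do
    if is_tests_package "$p"; then
        echo "$p: ok"
    else
        echo "$p: needs renaming"
    fi
done
```

A check like this could later back the /opt-based package checks mentioned at the end of the meeting.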
12:40:09 so automated systems know how to start the test....
12:40:28 yunta: well, /usr/share/tests/testsuitename/tests.xml?
12:40:41 yunta: that is handled with the test definition then
12:41:01 so xml is obligatory in every test package?
12:41:03 but it is not required to be in the test package, we can get back to this in the mer channel
12:41:09 yunta: no
12:41:15 lbt: I agree with PACKAGENAME-tests[-*], however should we define some common things for the [-*] part?
12:41:26 fine, let's discuss it later
12:41:37 good :)
12:41:57 Sage_: I suggest we defer that until someone feels the need to make one? Then raise it here?
12:42:05 fair enough
12:42:15 then we'll see why they want to do it.. and we'll have had more experience :)
12:42:32 yep, good way to do that
12:42:46 so -test packages should be renamed to -tests
12:43:05 yes
12:44:20 #info test packages must use PACKAGENAME-tests[-*] naming
12:45:46 #info [-*] suffixes will be defined later
12:46:07 one question: does QA need to approve all content of -tests?
12:46:44 and are upstream provided test scripts always considered as approved?
12:48:03 for the first one, yes. All tests should pass before they are accepted
12:48:32 for the latter one, I think we have to check them case by case
12:49:31 these are the guidelines at the moment, if we see that we have to change our process then we can change it
12:49:54 is there a concept of "expected result: fail"?
12:50:24 and/or skipping a test
12:50:48 if using testrunner, you can define the expected result
12:50:57 and you can use a filter with testrunner to skip tests
12:51:07 E-P, one thing... how about tests finding an actual fault in code
12:51:46 * iekku can't figure out why all tests need to be passed to get approved
12:52:42 iekku: we had many issues with failing test cases in Maemo, they never got passed
12:52:43 E-P: ok, then what if QA doesn't approve test scripts by upstream that would still be useful, where would those go? ;)
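A sketch of the /usr/share/tests/testsuitename/tests.xml entry point discussed above, loosely following the MeeGo-era test-definition format consumed by testrunner; the suite/set/case names and the step command are invented for illustration and the exact schema should be checked against the testrunner in use:

```xml
<!-- Hypothetical tests.xml sketch in a MeeGo-style test-definition
     format; all names and the step command are illustrative. -->
<testdefinition version="1.0">
  <suite name="connman-tests">
    <set name="smoke">
      <!-- expected_result lets a known failure be recorded as expected -->
      <case name="service-running" description="connman service is up">
        <step expected_result="0">systemctl is-active connman.service</step>
      </case>
    </set>
  </suite>
</testdefinition>
```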
12:53:01 no need to answer, just a thought ;)
12:53:15 Sage_: that's why I think they should still be in the package
12:53:25 Sage_: heh, -tests-notapprove ;)
12:53:38 * Sage_ :headdesk:
12:53:54 but the test-suite (ie T-R definition) should not run them or rely on them
12:54:09 I think we have to discuss this in more detail
12:54:33 E-P, and in meego also, i think
12:54:33 E-P: I think that we kinda permit anything into -tests packages
12:54:52 but we manage carefully how we treat the results
12:55:14 so if a new test appears, TR should ignore it until it's "approved"
12:55:26 and what goes to release and package testing
12:55:40 lbt: yes, good idea
12:56:00 this lets Sage_ put anything into -tests
12:56:12 it also lets QA decide what they rely on
12:56:13 example:
12:56:29 yes with the test mapping
12:56:40 http://review.merproject.org/#change,624 <- might have additional test scripts, how would QA notice? :)
12:57:02 maybe a better view: http://review.merproject.org/#patch,sidebyside,624,1,connman.spec
12:57:41 %{_libdir}/%{name}/test/*
12:57:49 Sage_: I know it's hard.... but you may have to have the developer talk to QA... :D
12:58:40 whoever is updating the package should inform qa if there are new tests
12:58:48 E-P: that won't work.
12:59:09 E-P: *nod* ... ideally with a patch to the T-R definition
12:59:25 Sage_: why?
12:59:32 Sage_: any better solution?
13:00:02 approve all upstream test scripts
13:00:41 that's not quite what was said
13:00:43 lbt: because that would actually mean that the developer would need to test all those scripts and then ask QA and only after that do the request
13:00:58 I didn't read it like that
13:01:28 Sage_: when someone reviews the 624 and sees extra tests, they should check if they're now part of the testplan
13:01:57 we all agree they should be?
13:02:02 lbt: the point is that you can't see from the review if there are extra tests without checking the tarball and comparing it to the old one.
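Sage_'s point at 13:02:02 (new upstream test scripts are invisible in the review without inspecting the tarball) can be worked around with a file-list diff of the two tarballs. A self-contained sketch; the tarball names and layout are made up here with dummy archives standing in for the old and new upstream source:

```shell
#!/bin/sh
# Sketch: spot new upstream test scripts between two source tarballs.
# Dummy tarballs stand in for e.g. connman 1.1 and 1.2 source releases.
mkdir -p demo/old/test demo/new/test
touch demo/old/test/run.sh demo/new/test/run.sh demo/new/test/backtrace
tar -C demo -czf old.tar.gz old
tar -C demo -czf new.tar.gz new

# The actual check: list each tarball's contents, strip the top-level
# directory, and show test/ entries that only exist in the new tarball.
tar tzf old.tar.gz | sed 's|^[^/]*/||' | sort > old.lst
tar tzf new.tar.gz | sed 's|^[^/]*/||' | sort > new.lst
comm -13 old.lst new.lst | grep -E '(^|/)test(s)?/' || echo "no new test files"
```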
13:02:24 in my opinion upstream tests are not part of the testplan
13:02:25 Sage_: yeah, unpleasant technicality though?
13:02:51 upstream tests aren't part of the testplan? Why not?
13:02:56 by default the upstream tests are not part of the test plan
13:03:08 after the review we can add them
13:03:12 ah
13:03:15 test case review
13:03:17 :nod:
13:03:34 sounds good
13:03:38 that really depends on how the upstream tests are made
13:03:42 OK... upstream tests should be part of the testplan
13:03:48 if possible.
13:03:51 yes
13:03:53 it is not always.
13:03:59 so, let me continue
13:04:15 http://pastie.org/4165567 <- include that ;)
13:04:20 we all agree they *should* be ... subject to the real world
13:04:25 sec Sage_
13:04:52 :P
13:05:02 so when the 1.1 -> 1.2 packaging is done, it would be nice to also send a patch to the testplan
13:05:05 however
13:05:14 a couple of tests may fail or not be runnable
13:05:25 so they're marked as 'expected fail' or 'skip'
13:05:59 * Sage_ thinks we aim for the same thing but are talking from totally different perspectives
13:06:02 the reviewer may either reject until that submit is done, or log a bug that it needs doing?
13:07:01 Sage_: what I'm saying is "package all new tests into the -tests package and tell QA they're there"
13:07:19 then QA decides how to change the test plan, if at all
13:07:27 yes
13:07:30 a polite developer may submit a testplan patch
13:07:33 ok, well why didn't you say so in the first place :P
13:07:37 I tried
13:07:38 :D
13:07:51 so -tests can have test scripts that aren't part of the plan
13:07:55 yes
13:07:57 yes
13:08:03 * Sage_ is fine
13:08:16 the actual test plan is separated from the test package
13:08:29 where is that btw?
13:08:52 *mumble*
13:09:11 * Sage_ is not sure about all the terms.
13:09:17 Sage_: that is still under development :)
13:09:25 ah.
13:09:36 we had a meeting about that a looong time ago
13:09:51 https://wiki.merproject.org/wiki/Quality/Test_processes
13:09:52 * Sage_ pours a bucket of water on top of lbt to avoid burning his brain.
13:10:05 way too late :)
13:10:06 anyway, are we now happy with the naming?
13:10:24 seems so - feels much clearer now
13:10:25 * Sage_ thought we agreed on that already
13:10:29 #info test package can have not-approved tests
13:11:05 * Sage_ is confused with the terms "-tests" and "test package"
13:11:19 #info meaning failing or not ready test cases
13:11:36 Sage_: they are the same, but not the same as in Meego :)
13:11:50 did I confuse you a bit more?
13:12:07 or test scripts that are provided by upstream and are not part of test cases.
13:12:41 Sage_: test cases = test plan == which of the test scripts provided by upstream we care about
13:12:52 and also our own tests
13:12:53 those should be in the -tests package, even if they are not test cases, we can use them as test cases later
13:13:02 100%
13:13:27 I have to go soon, so let's continue
13:13:35 let's continue and talk about this more when someone rejects my submits about the matter ;)
13:13:52 any wishes for the locations of the test executables and test data?
13:13:54 yep
13:14:08 Sage_: you mentioned a long time ago that we should use /opt/tests/
13:14:22 (or something like that)
13:14:38 can't recall exactly but I think that was the one I preferred at the time at least.
13:15:09 by location I mean where the -tests package is installing the executable files
13:15:28 * Sage_ *smiles*
13:15:48 what about the upstream stuff, should we keep those where upstream wants or move them? ;)
13:16:04 eg. /usr/lib/connman/test/backtrace
13:16:41 I'd use SHOULD rather than MUST :)
13:16:47 I would say not moving
13:17:02 yes, should
13:18:00 :)
13:18:08 how about for common test data, /usr/share/testdata/
13:18:59 just as a guideline
13:20:03 /opt/tests/<packagename>/testdata/ ?
13:20:17 and the test package's own test-definition to /usr/share/<packagename>/
13:20:45 /opt/tests/<packagename>/test-definitions/
13:21:18 * Sage_ is pondering why not put everything related to testing under the same dir?
13:21:43 if a test package has test data that only the package itself uses, it should be under /opt/
13:21:58 I meant more like common test data that many test packages are using
13:22:14 common test data?
13:22:21 or should that be also in the /opt/tests/<packagename>/
13:22:25 video
13:22:34 E-P: yes
13:22:35 ah
13:22:43 I would say similar to everything else
13:22:51 ok, fine for me
13:22:52 E-P: the problem with /usr/share is that you can't put arch-dependent stuff there, and imo it does not belong in lib either -- putting everything in /opt/tests is not a good solution, but imo the best one
13:23:13 aard_: +1
13:23:28 aard_: thanks for the clarification
13:23:35 fine for me
13:24:21 +1
13:24:32 (an additional benefit of /opt is that we can easily do package checks. test package and something not in /opt? reject. non-test package and something in /opt? reject.)
13:24:56 #info test executables should be installed to /opt/tests/<packagename>/
13:25:29 #info common test data should be installed to /opt/tests/<packagename>/{audio,video,image,etc}
13:25:48 #info test package's test-definition should be installed to /opt/tests/<packagename>/test-definition/
13:25:52 sounds like a macro to me: %{_testdir} = /opt/tests/%{name}/
13:26:12 *testsdir
13:26:37 not a bad idea, can you file a task bug about that?
13:27:15 sure, not sure though if %{name} can be used like that in a macro, but then just /opt/tests/
13:27:33 like I said, we can change everything later if we see that something is not working well
13:27:49 anything to add?
13:28:41 if not, thanks all
13:29:17 thanks guys
13:29:24 #endmeeting
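As an appendix to the %{_testsdir} macro idea raised near the end of the meeting, a sketch of how it and the agreed /opt/tests layout might look in packaging. The macro definition, subpackage, and file list are invented for illustration, not an agreed Mer macro:

```
# Hypothetical macro, e.g. in an rpm macros file; since rpm macros expand
# at use time, %{name} resolves per package that uses %{_testsdir}:
#   %_testsdir /opt/tests/%{name}

%package tests
Summary: Tests for %{name}

%files tests
%dir /opt/tests/%{name}
/opt/tests/%{name}/connman-smoke.sh
/opt/tests/%{name}/test-definition/tests.xml
/opt/tests/%{name}/testdata/sample.conf
```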